raycast-ai-openrouter-proxy by miikkaylisiurunen

Raycast AI proxy for custom models

Created 10 months ago
269 stars

Top 95.4% on SourcePulse

Project Summary

This project provides a proxy server that lets Raycast AI integrate with any OpenAI-compatible API, offering "Bring Your Own Key" (BYOK) functionality. It targets Raycast users who want to use custom models from providers such as OpenRouter, Gemini, or OpenAI directly within Raycast's AI features without a Pro subscription, broadening model choice and flexibility.

How It Works

The proxy acts as an intermediary, translating Raycast's Ollama-like requests into the OpenAI-compatible format required by various LLM providers. Deployed via Docker Compose, its behavior is configured through docker-compose.yml for API endpoints and keys, and models.json for defining model specifics like names, IDs, and capabilities. This approach allows users to bypass Raycast's native provider limitations and utilize a broader spectrum of AI models.
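A minimal sketch of what the Docker Compose configuration might look like. This is illustrative only: the service name, environment variable names, and volume path are assumptions, not the repository's actual file; the port matches the localhost:11435 default mentioned below, and the base URL shown is OpenRouter's OpenAI-compatible endpoint.

```yaml
# Hypothetical docker-compose.yml sketch -- check the repository for the real keys
services:
  proxy:
    build: .
    ports:
      - "11435:11435"            # Raycast's Ollama Host points at this port
    environment:
      - API_KEY=sk-or-...        # your provider API key (illustrative variable name)
      - BASE_URL=https://openrouter.ai/api/v1  # any OpenAI-compatible endpoint
    volumes:
      - ./models.json:/app/models.json  # model definitions mounted into the container
```

Pointing `BASE_URL` at a different OpenAI-compatible provider (and swapping the key) is all that changing providers should require under this setup.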

Quick Start & Requirements

  • Primary install / run command: Clone the repository, configure docker-compose.yml and models.json with your API key and provider details, then run docker compose up -d --build. Set Raycast's Ollama Host to localhost:11435 (or your configured port).
  • Non-default prerequisites and dependencies: Docker, an API key for your chosen provider, and Raycast.
  • Links: Repository: https://github.com/miikkaylisiurunen/raycast-ai-openrouter-proxy
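The install steps above, collected as commands. This assumes Docker with the Compose plugin is installed; the configuration edits between clone and launch are up to you.

```shell
# Clone the proxy and start it with Docker Compose
git clone https://github.com/miikkaylisiurunen/raycast-ai-openrouter-proxy
cd raycast-ai-openrouter-proxy

# Edit docker-compose.yml (API key, endpoint) and models.json (models) first,
# then build and run the proxy in the background:
docker compose up -d --build

# Finally, set Raycast's Ollama Host to localhost:11435 (or your configured port).
```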

Highlighted Details

  • Supports any OpenAI-compatible API provider, defaulting to OpenRouter.
  • Enables BYOK for Raycast AI features (Chat, Commands, Presets) without a Raycast Pro subscription.
  • Features include vision support, tool calling, system instructions, streaming responses, and automatic chat title generation.
  • Model configurations, including context length and provider-specific parameters, are managed via models.json.
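To make the last point concrete, a models.json entry might look roughly like this. The field names here are illustrative assumptions based on the README's description (names, IDs, context length, provider-specific parameters); consult the repository's example file for the actual schema.

```json
[
  {
    "name": "Claude Sonnet",
    "id": "anthropic/claude-3.5-sonnet",
    "context": 200000
  }
]
```

Raycast would then list the model under its `name`, while the proxy forwards requests using the provider-specific `id`.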

Maintenance & Community

The project is marked as "Work In Progress" with a warning about potential bugs. No specific details regarding contributors, sponsorships, or community channels (like Discord/Slack) are provided in the README.

Licensing & Compatibility

The project's license is not explicitly stated in the README. Compatibility is focused on Raycast AI. Remote deployment is possible but discouraged due to the lack of authentication.

Limitations & Caveats

This project is under active development and may contain bugs. Remote tools such as web search and image generation are not supported. Displaying the model's thinking process is provider-dependent and may require additional configuration. Security is a concern for remote deployments as no authentication is implemented.

Health Check

  • Last Commit: 10 months ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 0
  • Issues (30d): 0
  • Star History: 3 stars in the last 30 days
