Proxy worker for local LLM use in Cursor editor
Top 77.3% on SourcePulse
This project provides a proxy worker for integrating Ollama with the Cursor IDE, enabling local LLM inference for Cursor's AI features. It works around the limitation that Cursor's default server configuration cannot communicate directly with a local Ollama instance, giving users a way to leverage their own models.
How It Works
Curxy acts as an intermediary server, forwarding requests from the Cursor IDE to a running Ollama instance and relaying Ollama's responses back to Cursor. The proxy is exposed through a Cloudflare Tunnel so Cursor can reach it, sidestepping Cursor's default server routing and allowing seamless integration with locally hosted LLMs. The project uses Deno as its runtime.
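The forwarding idea can be pictured with a small Deno handler. This is a sketch only, not Curxy's actual code; the port, the OLLAMA_URL constant, and the handler shape are assumptions for illustration:

```ts
// Illustrative sketch (not Curxy's implementation): forward OpenAI-style
// requests from Cursor to a local Ollama instance and relay the response.
const OLLAMA_URL = "http://localhost:11434"; // assumed default Ollama address

Deno.serve({ port: 8000 }, async (req) => {
  const incoming = new URL(req.url);
  // Re-target the request at Ollama's OpenAI-compatible endpoint,
  // preserving path and query (e.g. /v1/chat/completions).
  const target = new URL(incoming.pathname + incoming.search, OLLAMA_URL);
  const hasBody = req.method !== "GET" && req.method !== "HEAD";
  const upstream = await fetch(target, {
    method: req.method,
    headers: req.headers,
    body: hasBody ? await req.arrayBuffer() : undefined,
  });
  // Relay Ollama's response (including streamed chunks) back to Cursor.
  return new Response(upstream.body, {
    status: upstream.status,
    headers: upstream.headers,
  });
});
```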
Quick Start & Requirements
Requires Deno and a running Ollama instance. Start the proxy with:

deno run -A jsr:@ryoppippi/curxy

Point Cursor's OpenAI base URL override at the tunnel URL printed on startup, with /v1 appended (e.g., https://your-curxy-url.trycloudflare.com/v1). Add the desired model names to Cursor's configuration. Optionally, set the OPENAI_API_KEY environment variable for access restriction.
Highlighted Details
Access to the proxy can be restricted by setting the OPENAI_API_KEY environment variable.
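A common way to enforce such a key check in a Deno handler is a bearer-token comparison against the environment variable. The sketch below is an assumption for illustration, not necessarily how Curxy implements it:

```ts
// Illustrative OPENAI_API_KEY gate (assumed behavior, not Curxy's code).
const expectedKey = Deno.env.get("OPENAI_API_KEY");

function isAuthorized(req: Request): boolean {
  if (!expectedKey) return true; // no key configured: accept all requests
  // OpenAI-compatible clients such as Cursor send the key as a bearer token.
  const auth = req.headers.get("Authorization") ?? "";
  return auth === `Bearer ${expectedKey}`;
}

// Placed in front of the forwarding handler:
//   if (!isAuthorized(req)) return new Response("Unauthorized", { status: 401 });
```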
Maintenance & Community
No specific information on contributors, sponsorships, or community channels is provided in the README.
Licensing & Compatibility
Limitations & Caveats
The project is described as a "simple proxy worker," which suggests it may lack the advanced features and robustness of more comprehensive solutions. Its reliance on Cloudflare Tunnel for a public URL also adds an external dependency and a potential point of failure.