HTTP/2 proxy for using DeepSeek, OpenRouter, and Ollama models with Cursor IDE
This project provides a high-performance HTTP/2 proxy server designed to bridge Cursor IDE's Composer and other OpenAI-compatible clients with alternative language models like DeepSeek, OpenRouter, and Ollama. It translates API requests, enabling users to leverage these models for AI-assisted coding and development without being tied to OpenAI's ecosystem.
How It Works
The proxy server is built with Go and leverages HTTP/2 for efficient communication. It intercepts requests formatted for the OpenAI API and transforms them into the specific formats required by DeepSeek, OpenRouter, or Ollama. Key features include full CORS support, streaming responses, function calling translation, automatic message format conversion, and compression support. This approach allows seamless integration with existing tools that expect an OpenAI-compatible endpoint.
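As a concrete sketch of the translation described above, a client such as Cursor's Composer sends a standard OpenAI-style chat-completions request to the proxy, which then rewrites it for the configured backend. The endpoint path, port, and authorization header below are assumptions based on the OpenAI API convention and the `docker run` example in the Quick Start section, not confirmed details:

```shell
# Build a standard OpenAI-style chat request (the shape the proxy intercepts).
cat > /tmp/req.json <<'EOF'
{
  "model": "gpt-4o",
  "messages": [{"role": "user", "content": "Write a hello-world in Go"}],
  "stream": true
}
EOF

# With the proxy running locally, the request would be sent roughly as
# (path, port, and auth header are illustrative):
# curl -N http://localhost:9000/v1/chat/completions \
#   -H "Content-Type: application/json" \
#   -H "Authorization: Bearer $API_KEY" \
#   -d @/tmp/req.json

grep -q '"model": "gpt-4o"' /tmp/req.json && echo "request payload written"
```

Because the proxy speaks the OpenAI wire format on the client side, tools that already target an OpenAI-compatible endpoint need only a base-URL change.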
Quick Start & Requirements
Build and run with Docker:

```shell
docker build -t cursor-deepseek .
# or select another backend: --build-arg PROXY_VARIANT=openrouter (or ollama)
docker run -p 9000:9000 --env-file .env cursor-deepseek
```

Alternatively, run directly with Go:

```shell
go run proxy.go
```
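The `--env-file .env` flag implies that provider credentials live in a `.env` file. A minimal sketch, assuming a single API-key variable (the name `DEEPSEEK_API_KEY` is illustrative; check the project README for the exact variable names):

```shell
# .env — consumed by `docker run --env-file .env`
# Variable name is an assumption, not confirmed by this summary.
DEEPSEEK_API_KEY=sk-your-key-here
```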
Configuration, such as provider API keys, is supplied via a `.env` file. Exposing the endpoint publicly (e.g., via ngrok) is recommended for external access.

Highlighted Details
The proxy maps requests for `gpt-4o` to DeepSeek's GPT-4o equivalent.

Maintenance & Community
The project is maintained by danilofalcao. Further community or roadmap information is not detailed in the README.
Licensing & Compatibility
Limitations & Caveats
The project requires a Cursor Pro subscription for its primary use case. While it supports multiple model providers, the README does not detail performance benchmarks or specific model compatibility beyond the primary mappings.