Lynkr: Universal LLM proxy for AI coding tools
Top 88.1% on SourcePulse
Lynkr is a self-hosted HTTP proxy that unifies AI coding tools such as Cursor and the Claude Code CLI with diverse LLM providers. It targets developers and enterprises seeking flexible, cost-effective (a claimed 60-80% cost reduction), and private AI interactions, acting as a universal LLM interface.
How It Works
Lynkr functions as a drop-in backend replacement: it intercepts requests from AI tools and routes them to more than 10 local or cloud LLM providers. Its architecture emphasizes efficiency through token optimization, prompt caching, and memory deduplication, enabling significant cost savings, and it supports fully local, private execution via backends such as Ollama and llama.cpp.
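The routing idea behind such a proxy can be sketched as follows. This is a minimal illustration, not Lynkr's actual code: the provider table, model-name prefixes, and URLs are assumptions (the Ollama endpoint is its conventional local default), and a real proxy would also forward the HTTP request body and credentials.

```javascript
// Hypothetical sketch: pick a backend for an OpenAI-style chat request
// based on the requested model name. Prefixes and URLs are illustrative.
const PROVIDERS = {
  "claude-": { baseUrl: "https://api.anthropic.com/v1", local: false },
  "gpt-":    { baseUrl: "https://api.openai.com/v1",    local: false },
  "llama":   { baseUrl: "http://localhost:11434/v1",    local: true  }, // Ollama default port
};

function routeRequest(model) {
  for (const [prefix, provider] of Object.entries(PROVIDERS)) {
    if (model.startsWith(prefix)) return provider;
  }
  // Unknown models fall back to the local backend, so nothing
  // leaves the machine unless a cloud provider is matched explicitly.
  return PROVIDERS["llama"];
}

console.log(routeRequest("llama3.1:8b").baseUrl); // http://localhost:11434/v1
```

A lookup like this is what makes the proxy a "drop-in" replacement: the coding tool keeps sending one request shape, and only the proxy knows which provider actually serves it.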
Quick Start & Requirements
Installation is recommended via npm (npm install -g lynkr, or run directly with npx lynkr). Alternatives include cloning the repository (npm install, then npm start) or using Docker (docker-compose up -d). Node.js is a prerequisite for the npm routes. Detailed guides for setup and provider configuration are available.
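The options above can be collected into one shell sketch. The package name and commands come from the text; the repository URL is not given there, and anything beyond these commands (flags, paths) is an assumption:

```shell
# Option 1: global install via npm (requires Node.js)
npm install -g lynkr
lynkr                 # or, without installing: npx lynkr

# Option 2: run from a clone of the repository
# git clone <repo-url> && cd lynkr   # repo URL not stated in the text
npm install
npm start

# Option 3: Docker, detached
docker-compose up -d
```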
Maintenance & Community
The project is developed "by developers, for developers"; community support is available via GitHub Discussions and the Issues tracker. No notable contributors, sponsorships, or partnerships are highlighted.
Licensing & Compatibility
Distributed under the Apache 2.0 license, permitting commercial use and integration into closed-source projects.
Limitations & Caveats
The project focuses on proxying and optimization; no performance benchmarks beyond the claimed cost reductions are published. MLX integration is limited to Apple Silicon hardware, and users should expect changes as development continues.