Augment-BYOK by AnkRoot

VS Code extension for routing LLM calls to custom providers

Created 1 month ago
285 stars

Top 92.0% on SourcePulse

View on GitHub
Project Summary

This project provides a single VS Code extension VSIX that routes Augment's 13 core LLM data endpoints to custom, Bring-Your-Own-Key (BYOK) providers. It targets Augment extension users who want more flexibility in LLM provider selection and API key management, offering centralized control and runtime rollback without external dependencies.

How It Works

The core approach is to patch the official Augment VS Code extension VSIX. The patched extension intercepts specific LLM data endpoints (e.g., /chat, /completion, /chat-stream) and redirects their traffic to user-configured BYOK providers; all other endpoints keep the official behavior. A runtime toggle lets users enable or disable BYOK routing at any time, falling back to the original Augment behavior without external services or complex setup. Configuration is managed directly in the extension's VS Code globalState.
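
As a rough illustration of this interception model, the sketch below shows how a fetch wrapper might honor the runtime toggle and redirect the three example endpoints to a user-configured provider. It is a minimal sketch only: the globalState keys (byok.enabled, byok.provider), the upstream URL, and the routedFetch helper are illustrative assumptions, not the extension's actual implementation.

```typescript
import * as vscode from "vscode";

// The three example endpoints named above; everything else passes through.
const BYOK_ENDPOINTS = new Set(["/chat", "/completion", "/chat-stream"]);

interface ByokProvider {
  baseUrl: string; // e.g. an OpenAI-compatible endpoint
  apiKey: string;
}

// Illustrative routing wrapper. The globalState keys and the upstream URL
// are placeholders, not the extension's real identifiers.
export function routedFetch(context: vscode.ExtensionContext) {
  return async (path: string, body: unknown): Promise<Response> => {
    const enabled = context.globalState.get<boolean>("byok.enabled", false);
    const provider = context.globalState.get<ByokProvider>("byok.provider");

    // Runtime toggle: when disabled, unconfigured, or hitting a non-data
    // endpoint, keep the original Augment behavior untouched.
    if (!enabled || !provider || !BYOK_ENDPOINTS.has(path)) {
      return fetch(`https://augment.example.com${path}`, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify(body),
      });
    }

    // Redirect the intercepted call to the BYOK provider with the user's key.
    return fetch(`${provider.baseUrl}${path}`, {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${provider.apiKey}`,
      },
      body: JSON.stringify(body),
    });
  };
}
```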

Quick Start & Requirements

  • Installation: Download the .vsix file from GitHub Releases (look for the rolling tag) and install via VS Code's "Extensions: Install from VSIX..." command.
  • Configuration: Open the "BYOK: Open Config Panel" command, configure at least one LLM provider, and save (a sketch of a plausible provider entry follows this list).
  • Prerequisites (for local building): Node.js 20+, Python 3.
  • Documentation: Configuration details are in docs/CONFIG.md, endpoint coverage in dist/endpoint-coverage.report.md.
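
For orientation, here is one plausible shape for a provider entry saved from the config panel, based on the features the README describes (three compatibility layers, keys kept in globalState rather than SecretStorage). Field names and values are hypothetical; docs/CONFIG.md is the authoritative reference.

```typescript
// Hypothetical shape of a provider entry saved by the config panel.
// The real schema lives in docs/CONFIG.md and may differ.
interface ByokProviderConfig {
  name: string;                              // label shown in the panel
  kind: "openai" | "anthropic" | "gemini";   // compatibility layer to use
  baseUrl: string;                           // provider endpoint
  apiKey: string;                            // kept in globalState, not SecretStorage
  model: string;                             // default model for routed requests
}

const exampleProvider: ByokProviderConfig = {
  name: "local-openai-compatible",
  kind: "openai",
  baseUrl: "http://localhost:8000/v1",
  apiKey: "sk-placeholder",
  model: "my-local-model",
};
```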

Highlighted Details

  • Single VSIX: All functionality is bundled into one file; no separate Rust component or external proxy service is required.
  • BYOK Routing: Intercepts and routes 13 LLM data endpoints to custom providers, supporting streaming.
  • Runtime Toggle & Rollback: Easily enable/disable BYOK functionality with commands like BYOK: Enable and BYOK: Disable (Rollback).
  • Auditable Builds: Locks upstream versions and injects SHA256 hashes for key components, producing coverage reports.
  • Fail-Fast Builds: Build process fails immediately if upstream changes break patching logic or contract requirements.
  • Extensive Provider Support: Includes compatibility layers for OpenAI-compatible, Anthropic, and Gemini AI Studio endpoints (see the sketch after this list).
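
To make the compatibility-layer idea concrete, the sketch below builds provider-specific request shapes for the three supported families, following their public APIs (OpenAI-style chat completions, the Anthropic Messages API, and Gemini AI Studio's streamGenerateContent). How the extension actually maps Augment's payloads onto these formats is not shown; buildRequest and its parameters are illustrative only.

```typescript
type ChatMessage = { role: "user" | "assistant"; content: string };

// Illustrative request builders for the three compatibility layers listed
// above. Endpoint paths and headers follow the providers' public APIs;
// the extension's real mapping of Augment payloads is not reproduced here.
function buildRequest(
  kind: "openai" | "anthropic" | "gemini",
  baseUrl: string, // e.g. https://api.openai.com/v1, https://api.anthropic.com,
                   // or https://generativelanguage.googleapis.com
  apiKey: string,
  model: string,
  messages: ChatMessage[],
): { url: string; headers: Record<string, string>; body: unknown } {
  switch (kind) {
    case "openai": // OpenAI-compatible chat completions
      return {
        url: `${baseUrl}/chat/completions`,
        headers: {
          Authorization: `Bearer ${apiKey}`,
          "Content-Type": "application/json",
        },
        body: { model, messages, stream: true },
      };
    case "anthropic": // Anthropic Messages API
      return {
        url: `${baseUrl}/v1/messages`,
        headers: {
          "x-api-key": apiKey,
          "anthropic-version": "2023-06-01",
          "Content-Type": "application/json",
        },
        body: { model, max_tokens: 1024, messages, stream: true },
      };
    case "gemini": // Gemini AI Studio (Generative Language API), SSE streaming
      return {
        url: `${baseUrl}/v1beta/models/${model}:streamGenerateContent?alt=sse&key=${apiKey}`,
        headers: { "Content-Type": "application/json" },
        body: {
          contents: messages.map((m) => ({
            role: m.role === "assistant" ? "model" : "user",
            parts: [{ text: m.content }],
          })),
        },
      };
    default:
      throw new Error(`unknown provider kind: ${kind}`);
  }
}
```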

Maintenance & Community

Specific details regarding maintainers, sponsorships, or community channels (like Discord/Slack) are not provided in the README. The repository's documentation, including its detailed README and docs/ROADMAP.md, serves as the primary source of information for ongoing development and planned features.

Licensing & Compatibility

The README does not explicitly state the project's license. As this project modifies another extension's VSIX, potential licensing conflicts or compatibility issues with the original Augment extension's license should be carefully reviewed before adoption, especially for commercial use.

Limitations & Caveats

The project explicitly excludes replicating Augment's control plane, permissions, secrets management, telemetry, or Remote Agents. It forbids autoAuth and does not support configuration via environment variables, YAML, or SecretStorage to maintain a single source of truth. Major upstream Augment VSIX updates may require manual review and updates to the patching logic, although the build process is designed to fail fast in such cases. Troubleshooting may be required for specific proxy implementations or SSE support variations across custom LLM providers. The "rolling" release tag suggests active development and potential instability.
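
For the SSE caveat above, a small standalone probe can help confirm whether a given provider or intermediate proxy actually streams server-sent events. The snippet below is a sketch for Node 18+ against an OpenAI-compatible endpoint; probeSse and its arguments are placeholders, not part of this project.

```typescript
// Standalone SSE probe (Node 18+) for an OpenAI-compatible endpoint.
// Useful when a proxy buffers or re-encodes streamed responses.
// baseUrl, apiKey, and model are placeholders for your own provider.
async function probeSse(baseUrl: string, apiKey: string, model: string): Promise<void> {
  const res = await fetch(`${baseUrl}/chat/completions`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model,
      stream: true,
      messages: [{ role: "user", content: "ping" }],
    }),
  });

  // A streaming-capable endpoint should answer with server-sent events;
  // buffering proxies often report plain application/json instead.
  console.log("status:", res.status);
  console.log("content-type:", res.headers.get("content-type")); // expect text/event-stream

  const reader = res.body?.getReader();
  if (!reader) return;
  const { value } = await reader.read();
  console.log("first chunk:", new TextDecoder().decode(value).slice(0, 120));
  await reader.cancel();
}
```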

Health Check

  • Last Commit: 1 week ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 4
  • Issues (30d): 22

Star History

91 stars in the last 30 days

Explore Similar Projects

Starred by Chip Huyen (author of "AI Engineering" and "Designing Machine Learning Systems") and David Cramer (cofounder of Sentry).

llmgateway by theopenco

LLM API gateway for unified provider access

Created 10 months ago
Updated 21 hours ago
874 stars

Top 1.7% on SourcePulse