larsderidder/context-lens: LLM context window visualizer and cost optimizer
Top 84.9% on SourcePulse
Context Lens tackles LLM context window cost optimization for AI coding tools by providing a local proxy and visualizer. It breaks down the composition of each API call, revealing where token budgets are spent, so developers can understand and reduce expenses without modifying tool code. This is especially useful for users of closed-source tools like Claude Code, Codex, and Gemini CLI.
How It Works
A local proxy intercepts LLM API calls, forwarding them to the actual endpoints while capturing request/response data. An analysis server processes these captures to detail context composition (system prompts, history, tool results) and estimate costs. Because the proxy is transparent, this architecture works in a framework-agnostic way with closed-source tools, offering context composition analysis beyond simple token counts.
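To illustrate the kind of context composition analysis described above, here is a minimal, hypothetical sketch: it groups an OpenAI-style messages array by category and applies a rough characters-per-token heuristic. The function names and the ~4-chars-per-token estimate are assumptions for illustration, not Context Lens's actual code or API.

```python
def estimate_tokens(text: str) -> int:
    """Very rough token estimate (~4 characters per token)."""
    return max(1, len(text) // 4)

def context_breakdown(messages: list[dict]) -> dict[str, int]:
    """Group estimated token counts by message category:
    system prompt, tool results, and conversation history."""
    breakdown = {"system": 0, "tool": 0, "history": 0}
    for msg in messages:
        role = msg.get("role", "user")
        if role == "system":
            breakdown["system"] += estimate_tokens(msg["content"])
        elif role == "tool":
            breakdown["tool"] += estimate_tokens(msg["content"])
        else:  # user / assistant turns count as history
            breakdown["history"] += estimate_tokens(msg["content"])
    return breakdown

# Example: a long system prompt and a large tool result dwarf the
# actual user request, which is exactly what a visualizer surfaces.
messages = [
    {"role": "system", "content": "You are a helpful coding assistant. " * 10},
    {"role": "user", "content": "Refactor this function."},
    {"role": "tool", "content": "file contents... " * 50},
]
print(context_breakdown(messages))
```

A real analyzer would use the provider's tokenizer and parse captured request bodies, but the breakdown-by-category idea is the same.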
Quick Start & Requirements
Install via npm (npm install -g context-lens), pnpm, or npx. Run with context-lens <tool_alias> (e.g., context-lens claude). Docker images are available. mitmproxy is required for specific tools (Codex subscription, Cline, Pi subscription models) needing HTTPS interception. The UI is accessible locally at http://localhost:4041.
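The install and launch steps above, as shell commands (taken from the description; pnpm and npx work as alternatives to a global npm install):

```shell
# Install globally via npm
npm install -g context-lens

# Run against a supported tool alias, e.g. Claude Code
context-lens claude

# Then open the local UI at http://localhost:4041
```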
Maintenance & Community
Team dashboards are planned; users can open issues for roadmap input. No explicit community channels are listed.
Licensing & Compatibility
Released under the MIT License, permitting commercial use. Designed for individual developers, with team features planned. Runs locally and privately.
Limitations & Caveats
Team features are still under development. mitmproxy is required to intercept HTTPS traffic from certain tools. Google Gemini support is experimental.