Codag (michaelzixizhou): Visualize AI/LLM code workflows
Top 79.7% on SourcePulse
Codag visualizes complex AI/LLM workflows directly within VSCode, transforming code analysis for AI engineers and agent builders. It automatically maps LLM API calls, decision branches, and processing steps across codebases, generating interactive graphs that link directly to source code, significantly reducing debugging time and simplifying onboarding onto intricate AI projects.
How It Works
Codag employs a multi-stage analysis pipeline. Tree-sitter parses code into ASTs across 10+ languages. Pattern matching detects LLM API calls and framework usage, followed by call graph extraction. A backend service leverages Gemini 2.5 Flash to semantically interpret structures, identifying workflow nodes, edges, and decision points. Live updates are enabled by incremental Tree-sitter re-parsing and AST diffing, reflecting code changes instantly without full LLM re-analysis. ELK and D3.js render interactive, theme-aware graphs.
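The pattern-matching stage can be illustrated with a minimal sketch. This is not Codag's actual code: Codag parses 10+ languages with Tree-sitter, while this example uses Python's own `ast` module as a stand-in, and the `LLM_CALL_PATTERNS` set is a hypothetical pattern list, not the extension's real detection rules.

```python
# Simplified sketch of LLM-API-call detection, assuming Python's `ast`
# in place of Tree-sitter. Pattern names are illustrative only.
import ast

# Hypothetical name fragments that suggest an LLM API call.
LLM_CALL_PATTERNS = {"openai", "anthropic", "generate_content", "chat", "completions"}

def find_llm_calls(source: str) -> list[tuple[int, str]]:
    """Return (line, dotted_name) pairs for calls matching known patterns."""
    tree = ast.parse(source)
    hits = []
    for node in ast.walk(tree):
        if not isinstance(node, ast.Call):
            continue
        # Rebuild the dotted call target, e.g. client.chat.completions.create
        parts = []
        target = node.func
        while isinstance(target, ast.Attribute):
            parts.append(target.attr)
            target = target.value
        if isinstance(target, ast.Name):
            parts.append(target.id)
        dotted = ".".join(reversed(parts))
        if any(p in LLM_CALL_PATTERNS for p in parts):
            hits.append((node.lineno, dotted))
    return hits

sample = """
import openai
client = openai.OpenAI()
resp = client.chat.completions.create(model="gpt-4o", messages=[])
print(resp)
"""
print(find_llm_calls(sample))
```

A real implementation would feed matches like these into the call-graph extraction step; the incremental updates then come from re-parsing only edited regions and diffing the resulting ASTs.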
Quick Start & Requirements
Configure backend/.env.example with a Gemini API key, then run docker compose up -d. Alternatively, set up a Python 3.11 virtual environment and run python main.py.
Highlighted Details
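The self-hosted setup described above might look like the following. The repository layout, the environment-variable name, and the requirements file are assumptions inferred from the description; check the project's README for the authoritative steps.

```shell
# Assumed quick-start sketch; paths and variable names are illustrative.
cd backend
cp .env.example .env        # then set your Gemini API key inside .env
docker compose up -d        # start the analysis backend in the background

# Or, without Docker (Python 3.11):
python3.11 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt   # assumed dependency file
python main.py
```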
Maintenance & Community
The project welcomes pull requests. The roadmap includes a hosted backend and Git commit diff views. Contact is available via michael@codag.ai. No community channels (e.g., Discord/Slack) or notable contributors are listed.
Licensing & Compatibility
Limitations & Caveats
The project currently requires users to self-host the analysis backend and provide their own Gemini API key. Key features like a hosted backend and Git diff comparison are still on the roadmap. Adding support for new LLM providers or frameworks requires manual code contributions.
Last updated: 1 week ago · Status: Inactive