talk-codebase by rsaryev

CLI tool for chatting with a codebase and docs

created 2 years ago
517 stars

Top 61.5% on sourcepulse

Project Summary

This tool enables conversational interaction with codebases and documentation using LLMs, supporting both privacy-focused offline processing with LlamaCpp/GPT4All and cloud-based OpenAI integration. It's designed for developers and researchers seeking to query and understand their projects through natural language.

How It Works

The tool processes local files and code repositories, indexing them for LLM interaction. It offers a choice between local LLM inference (LlamaCpp, GPT4All) for enhanced privacy and reduced cost, or OpenAI's API for potentially higher quality responses. Configuration is managed via a YAML file, allowing customization of ignored files and model choices.

Quick Start & Requirements

  • Install via pip: pip install talk-codebase
  • Requires Python 3.8.1+ and a Git repository.
  • OpenAI API key needed for cloud-based models.
  • Configuration can be reset with talk-codebase configure.
  • Supported file types include .csv, .doc, .docx, .epub, .md, .pdf, .txt, and popular programming languages.
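The quick-start steps above can be sketched as a shell session. The `configure` subcommand is taken from the list above; the `chat` subcommand and the path argument are assumptions based on typical usage of this tool, so check `talk-codebase --help` for the exact invocation in your installed version:

```shell
# Install the CLI (Python 3.8.1+ required)
pip install talk-codebase

# Pick a model backend (OpenAI, LlamaCpp, or GPT4All) and, for cloud
# models, supply an OpenAI API key; re-run at any time to reset settings
talk-codebase configure

# Start a chat session against a local Git repository
# (the `chat` subcommand and "." path are assumed, not confirmed)
talk-codebase chat .
```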

Highlighted Details

  • Supports offline LLM inference for privacy.
  • Integrates with OpenAI, LlamaCpp, and GPT4All.
  • Handles a variety of document and code file types.
  • Configuration is customizable via ~/.config.yaml.
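Since the tool stores its settings in ~/.config.yaml, a minimal sketch of what that file might contain is shown below. The key names (model_type, api_key, ignore) are illustrative assumptions, not the tool's documented schema; the ignored-files list and model choice are the two customizations the summary above mentions:

```yaml
# Hypothetical ~/.config.yaml sketch -- key names are assumptions
model_type: openai        # or: llamacpp, gpt4all for offline inference
api_key: sk-...           # only needed when using OpenAI models
ignore:                   # files and directories excluded from indexing
  - node_modules
  - .venv
  - "*.lock"
```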

Maintenance & Community

The project is under active development and welcomes bug reports and feature suggestions via its issue tracker.

Licensing & Compatibility

The README does not specify a license. Compatibility for commercial use or closed-source linking is not detailed.

Limitations & Caveats

The project is explicitly described as under development and is recommended for educational purposes rather than production use.

Health Check

  • Last commit: 8 months ago
  • Responsiveness: 1 week
  • Pull Requests (30d): 0
  • Issues (30d): 0
  • Star History: 8 stars in the last 90 days
