tlm by yusufcanb

Local CLI copilot using Ollama for command-line assistance

Created 1 year ago
1,449 stars

Top 28.2% on SourcePulse

Project Summary

tlm provides local, offline command-line assistance powered by open-source LLMs via Ollama. It targets developers and power users seeking an alternative to cloud-based AI assistants, offering features like command suggestion, explanation, and context-aware Q&A without requiring API keys or internet connectivity.

How It Works

tlm integrates with Ollama to leverage various open-source models (e.g., Llama 3, Phi-4, DeepSeek-R1, Qwen) directly on the user's machine. It supports automatic shell detection for seamless integration and offers a Retrieval Augmented Generation (RAG) capability for context-aware queries, allowing users to provide local file paths or patterns for more relevant responses.
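The RAG flow described above can be sketched as a single shell invocation. This is an illustrative sketch only: the `--context`/`--include` flags mirror the README's context-aware Q&A examples and may differ between tlm releases, so treat the exact flag names as assumptions.

```shell
# Hypothetical sketch of a context-aware query (flag names assumed from the
# README's RAG examples; verify against your installed tlm version).
if command -v tlm >/dev/null 2>&1; then
  # Point tlm at the current directory and restrict retrieval to markdown
  # files, so the model grounds its answer in local project docs.
  tlm ask --context . --include "*.md" "What does this project do?"
  status="asked"
else
  status="tlm not installed"
fi
echo "$status"
```

Because retrieval happens over local files and inference runs in Ollama, no data leaves the machine at any step.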

Quick Start & Requirements

  • Installation: Recommended via script (curl ... | sudo -E bash for Linux/macOS, Invoke-RestMethod ... | Invoke-Expression for Windows PowerShell) or go install github.com/yusufcanb/tlm@1.2 if Go 1.22+ is installed.
  • Prerequisites: Ollama must be installed and running locally.
  • Resources: Requires sufficient local resources to run LLMs.
  • Docs: Usage examples are provided within the README.
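Before running the install script, it is worth confirming the Ollama prerequisite is met. A minimal check, assuming Ollama's default configuration: its HTTP API listens on `localhost:11434`, and `/api/tags` lists locally pulled models.

```shell
# Prerequisite check: Ollama serves an HTTP API on port 11434 by default.
# A successful response from /api/tags means the daemon is up and reachable.
if command -v curl >/dev/null 2>&1 \
    && curl -fsS http://localhost:11434/api/tags >/dev/null 2>&1; then
  ollama_status="running"
else
  ollama_status="not reachable"
fi
echo "ollama: $ollama_status"
```

If the check reports "not reachable", start Ollama (and pull a model) before installing tlm.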

Highlighted Details

  • No API keys or internet connection required.
  • Supports automatic shell detection (Bash, Zsh, PowerShell).
  • Offers one-liner command generation and explanation.
  • Features "no-brainer RAG" for context-aware Q&A.
  • Allows experimentation with various Ollama-compatible models and parameters.
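The one-liner generation and explanation features above can be sketched as follows. Subcommand names are taken from the README; the prompts are invented examples, and the sketch assumes tlm and a running Ollama instance are available.

```shell
# Sketch of the two core features (subcommand names per the README;
# example prompts are hypothetical).
if command -v tlm >/dev/null 2>&1; then
  tlm suggest "list all listening TCP ports"   # generate a one-liner
  tlm explain "tar -xzvf archive.tar.gz"       # explain an existing command
  demo="ran"
else
  echo "tlm not installed; skipping demo"
  demo="skipped"
fi
```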

Maintenance & Community

The project is maintained by yusufcanb. No specific community channels or roadmap links are provided in the README.

Licensing & Compatibility

The README does not explicitly state a license. Compatibility for commercial use or closed-source linking is not specified.

Limitations & Caveats

The ask command is marked as beta. The README does not include model performance benchmarks, and beyond general support for macOS, Linux, and Windows it does not specify compatibility for particular hardware configurations.

Health Check

  • Last Commit: 5 months ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 0
  • Issues (30d): 0
  • Star History: 7 stars in the last 30 days

Explore Similar Projects

Starred by Christian Laforte (Distinguished Engineer at NVIDIA; former CTO at Stability AI), Jason Knight (Director of AI Compilers at NVIDIA; cofounder of OctoML), and 1 more.

shell_sage by AnswerDotAI

CLI tool for terminal context analysis using LLMs

  • Top 0.3% on SourcePulse
  • 362 stars
  • Created 10 months ago; updated 2 months ago