mods by charmbracelet

CLI tool for adding AI to command-line pipelines

created 2 years ago
3,954 stars

Top 12.6% on sourcepulse

View on GitHub
Project Summary

Mods AI enhances command-line pipelines by integrating Large Language Models (LLMs) to process command output. It allows users to query, format, and transform text-based results from standard input into formats like Markdown or JSON, making command-line workflows "artificially intelligent." The tool is designed for developers and power users seeking to leverage AI within their existing shell environments.

How It Works

Mods functions by taking standard input, prepending a user-defined prompt, and sending this combined text to a specified LLM. It then outputs the LLM's response, with options to format it and control various generation parameters. This approach enables users to interact with command output conversationally, asking questions or requesting specific transformations directly within their terminal. It supports multiple LLM providers, including local options like LocalAI and cloud services like OpenAI, Cohere, Groq, and Azure OpenAI.
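The pipeline described above looks like this in practice. These are usage sketches, not exhaustive: they assume an API key is already configured, and flag spellings can differ between versions, so consult `mods --help` for your install.

```shell
# Pipe command output into mods with a prompt; the LLM's response prints to stdout.
ls ~/downloads | mods "which of these files can I safely delete?"

# Ask for Markdown-formatted output (-f) and page through it:
git log --oneline -20 | mods -f "summarize these commits" | less -R

# List saved conversations:
mods -l
```

Because mods reads standard input and writes standard output, it composes with any other filter in a pipeline, just like grep or sed.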

Quick Start & Requirements

  • Installation: brew install charmbracelet/tap/mods (macOS/Linux), winget install charmbracelet.mods (Windows), yay -S mods (Arch), nix-shell -p mods (Nix), or via package managers for Debian/Ubuntu and Fedora/RHEL. Can also be installed with go install github.com/charmbracelet/mods@latest.
  • Prerequisites: API keys for chosen LLM providers (OpenAI, Azure OpenAI, Cohere, Groq, Google Gemini) or a locally running LLM compatible with OpenAI endpoints (e.g., LocalAI).
  • Setup: Installation via package managers is typically quick. Configuration involves setting environment variables for API keys or pointing to local endpoints.
  • Documentation: usage examples and a full feature list are available in the project README.
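A minimal setup sketch for the steps above. The key value is a placeholder, and environment-variable names for other providers differ, so check the project documentation for your provider:

```shell
# Cloud providers read their API keys from environment variables, e.g. OpenAI:
export OPENAI_API_KEY="sk-..."

# To target a local OpenAI-compatible server such as LocalAI instead,
# edit the settings file (opened in $EDITOR by `mods --settings`) and
# point the API's base URL at the local endpoint.
```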

Highlighted Details

  • Supports multiple LLM providers including OpenAI, Azure OpenAI, Cohere, Groq, Gemini, and local options via LocalAI.
  • Enables conversational interaction with command output, allowing users to query and reformat results.
  • Features saved conversations with SHA-1 identifiers and titles for tracking history.
  • Offers custom role support for pre-defined system prompts, such as creating a "shell expert" role.
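The custom-role feature can be sketched as a settings-file entry. The schema below and the role name "shell" are illustrative; check the file that `mods --settings` opens for the exact format your version uses:

```yaml
# Hypothetical entry in the mods settings file (open it with `mods --settings`):
roles:
  shell:
    - you are a shell expert
    - you do not explain anything
    - you simply output one-line shell commands
```

A role is then selected per invocation, e.g. `mods --role shell "list files modified in the last 24 hours"`.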

Maintenance & Community

  • Part of the Charm ecosystem.
  • Community engagement encouraged via Twitter and Discord.

Licensing & Compatibility

  • MIT License.
  • Compatible with commercial use and closed-source linking.

Limitations & Caveats

The tool depends on external LLM APIs or a locally hosted model, so response quality, latency, and availability all vary with the chosen provider and configuration. While many providers are supported, compatibility and output quality can differ from model to model.

Health Check

  • Last commit: 2 days ago
  • Responsiveness: 1+ week
  • Pull Requests (30d): 13
  • Issues (30d): 18
  • Star History: 471 stars in the last 90 days
