CLI tool for adding AI to command-line pipelines
Top 12.6% on sourcepulse
Mods enhances command-line pipelines by integrating large language models (LLMs) to process command output. It lets users query, format, and transform text from standard input into formats like Markdown or JSON, making command-line workflows "artificially intelligent." The tool is aimed at developers and power users who want to use AI within their existing shell environments.
How It Works
Mods functions by taking standard input, prepending a user-defined prompt, and sending this combined text to a specified LLM. It then outputs the LLM's response, with options to format it and control various generation parameters. This approach enables users to interact with command output conversationally, asking questions or requesting specific transformations directly within their terminal. It supports multiple LLM providers, including local options like LocalAI and cloud services like OpenAI, Cohere, Groq, and Azure OpenAI.
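As a rough sketch of the workflow, piping any command's output into mods along with a prompt returns the model's answer on stdout. The prompts below are illustrative, and the -f flag (request Markdown-formatted output) is an assumption worth verifying against mods --help:
ls -la | mods "what kinds of files are in this directory?"
git log -n 20 --oneline | mods -f "write brief release notes for these commits"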
Quick Start & Requirements
brew install charmbracelet/tap/mods (macOS/Linux)
winget install charmbracelet.mods (Windows)
yay -S mods (Arch)
nix-shell -p mods (Nix)
Packages are also available for Debian/Ubuntu and Fedora/RHEL, and the tool can be installed with go install github.com/charmbracelet/mods@latest.
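After installation, a minimal first run might look like the following, assuming an OpenAI-backed setup; the API key value and the prompt are placeholders:
export OPENAI_API_KEY="sk-..."
ls | mods "which of these files look like configuration files?"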
Highlighted Details
Maintenance & Community
Licensing & Compatibility
Mods is open source under the MIT License and runs on macOS, Linux, and Windows.
Limitations & Caveats
The tool relies on external LLM APIs or a local LLM setup, so availability and performance depend on the configured provider, and an API key (or a running local endpoint) must be set up before use. While many providers are supported, model compatibility and output quality vary between them.
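Switching providers or models is a configuration step. The sketch below assumes the --settings command and -m model flag described in the upstream documentation, and the model name is illustrative:
mods --settings                                       # open the config file to add or edit provider endpoints and keys
echo "hello" | mods -m llama3 "reply with one word"   # -m selects a model defined in the config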