shell-ai by ibigio

AI shell assistant for generating commands and code snippets

Created 3 years ago
429 stars

Top 69.1% on SourcePulse

View on GitHub
Project Summary

ShellAI is an AI-powered command-line assistant designed to help developers and power users quickly find shell commands, code snippets, and explanations without leaving the terminal. It aims to significantly reduce the time spent searching for information online, offering a minimal and convenient user experience.

How It Works

ShellAI uses large language models (LLMs) to interpret natural language queries and generate relevant shell commands or code snippets. It offers a fast, syntax-highlighted interface, automatically extracts and copies generated code, and supports follow-up questions to refine results. OpenAI's GPT models are supported out of the box, and other providers or local LLMs can be configured through its config.yaml file.
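
For illustration, a typical interaction looks roughly like this (the query and the suggested command below are invented examples; actual output depends on the configured model):

```sh
# Ask ShellAI for a command in plain English.
q "find all files larger than 100MB in this directory"

# ShellAI replies with one or more syntax-highlighted suggestions, e.g.:
#   find . -type f -size +100M
# The generated command is extracted and copied automatically, and
# follow-up questions can then refine the result.
```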

Quick Start & Requirements

  • Install: Homebrew (brew install shell-ai) or script (curl ... | bash).
  • Prerequisites: OpenAI API key (or other LLM endpoint configuration).
  • Configuration: Set the OPENAI_API_KEY environment variable (see the sketch after this list). Advanced configuration for local models or Azure OpenAI is available via q config and direct file editing.
  • Docs: Custom Model Configuration
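
As a rough sketch of the quick-start path above (the API key value and the query are placeholders):

```sh
# Install via Homebrew and set the required API key (placeholder value).
brew install shell-ai
export OPENAI_API_KEY="sk-..."   # add to your shell profile to persist it

# Then ask for a command in natural language.
q "show the 10 largest directories under /var"
```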

Highlighted Details

  • Supports GPT-3.5 and GPT-4, with extensibility for local OSS models (e.g., via llama.cpp).
  • Features auto-extraction and clipboard copying of generated code.
  • Includes a q config revert command to restore previous configurations.
  • Allows customization of prompts and model endpoints in ~/.shell-ai/config.yaml (see the sketch after this list).
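
The snippet below is a hedged sketch of that customization: the YAML keys shown are assumptions for illustration, not ShellAI's documented schema, so check the Custom Model Configuration docs for the real field names.

```sh
# Back up the current config, then point ShellAI at a hypothetical local
# OpenAI-compatible endpoint by editing ~/.shell-ai/config.yaml directly.
cp ~/.shell-ai/config.yaml ~/.shell-ai/config.yaml.bak
cat > ~/.shell-ai/config.yaml <<'EOF'
model: local-llama                    # hypothetical key: model name to request
endpoint: http://localhost:8080/v1    # hypothetical key: inference server URL
prompt: |                             # hypothetical key: custom system prompt
  You are a concise shell assistant. Reply with a single command.
EOF

# If the new setup misbehaves, restore the backup (or try `q config revert`).
cp ~/.shell-ai/config.yaml.bak ~/.shell-ai/config.yaml
```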

Maintenance & Community

The project is maintained by @ilanbigio. Future development focuses on building a comprehensive configuration TUI and setting up model install templates.

Licensing & Compatibility

The repository does not explicitly state a license in the provided README.

Limitations & Caveats

Configuration for local models requires manual setup of LLM inference servers (e.g., llama.cpp) and careful prompt engineering. The configuration TUI is still under development, necessitating direct file editing for advanced setups.
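
As one example of that manual setup, a llama.cpp server can be launched roughly as follows (the binary name and flags vary with the llama.cpp version, and the model path is a placeholder):

```sh
# Serve a local GGUF model over an OpenAI-compatible HTTP API.
# Recent llama.cpp builds ship this binary as `llama-server`; older ones as `server`.
./llama-server -m ./models/your-model.gguf --port 8080
# The server listens on http://localhost:8080 (OpenAI-style routes under /v1),
# which is the kind of endpoint you would then reference in ~/.shell-ai/config.yaml.
```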

Health Check

  • Last Commit: 1 month ago
  • Responsiveness: 1 week
  • Pull Requests (30d): 0
  • Issues (30d): 1
  • Star History: 5 stars in the last 30 days
