TUI for managing and using local/cloud LLMs
PAR LLAMA is a TUI application for managing and interacting with Large Language Models (LLMs), primarily targeting Ollama but also supporting major cloud providers. It offers a user-friendly interface for chatting, managing models, and configuring sessions, benefiting users who prefer a terminal-based experience for LLM interaction.
How It Works
Built with Textual and Rich, PAR LLAMA provides a sophisticated Text User Interface. It leverages PAR AI Core for its backend logic. The application supports various LLM providers, including Ollama (local and remote instances), OpenAI, Anthropic, Groq, Google, and others via LiteLLM. It features a tabbed interface for managing different chats, models, and settings, with extensive command-line arguments and environment variable support for customization.
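Cloud provider credentials are typically supplied through environment variables; below is a minimal sketch assuming the standard key names used by each provider's SDK and an --ollama-url flag for remote instances (these exact names are assumptions, not confirmed by this summary):

export OPENAI_API_KEY=sk-...          # OpenAI
export ANTHROPIC_API_KEY=sk-ant-...   # Anthropic
export GROQ_API_KEY=gsk_...           # Groq
parllama --ollama-url http://192.168.1.10:11434   # remote Ollama instance (flag name assumed; check parllama --help)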
Quick Start & Requirements
uv tool install parllama
or pipx install parllama
. Development install: make setup
(requires uv
and make
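Once installed, the TUI starts from a single command; a quick sketch (--help should list the command-line arguments mentioned above):

parllama          # launch the TUI
parllama --help   # show available command-line options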
Highlighted Details
Includes in-chat slash commands such as /add.image for attaching images to a conversation.
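As an illustration of the in-chat command style, assuming /add.image takes a file path (the argument syntax is an assumption, not confirmed by this summary):

/add.image ./photo.png    # attach an image to the next prompt (syntax assumed)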
Maintenance & Community
The project is actively maintained with frequent updates, as evidenced by the detailed changelog. Community interaction points are not explicitly listed in the README, but contribution guidelines are provided.
Licensing & Compatibility
The README does not explicitly state a license. Compatibility is noted for Windows 11 x64, Windows WSL x64, macOS (Intel and Apple Silicon), and Linux.
Limitations & Caveats
Docker is required only for HuggingFace model quantization. Token counting may not always be 100% accurate, and some Ollama API features, such as CPU/GPU usage percentages, are unavailable when connecting to a remote Ollama instance.