parllama by paulrobello

TUI for managing and using local/cloud LLMs

created 1 year ago
349 stars

Top 80.8% on sourcepulse

1 Expert Loves This Project
Project Summary

PAR LLAMA is a TUI application for managing and interacting with Large Language Models (LLMs), primarily targeting Ollama but also supporting major cloud providers. It offers a user-friendly interface for chatting, managing models, and configuring sessions, benefiting users who prefer a terminal-based experience for LLM interaction.

How It Works

Built with Textual and Rich, PAR LLAMA provides a sophisticated Text User Interface. It leverages PAR AI Core for its backend logic. The application supports various LLM providers, including Ollama (local and remote instances), OpenAI, Anthropic, Groq, Google, and others via LiteLLM. It features a tabbed interface for managing different chats, models, and settings, with extensive command-line arguments and environment variable support for customization.

Quick Start & Requirements

  • Installation: uv tool install parllama or pipx install parllama. Development install: make setup (requires uv and make).
  • Prerequisites: Python 3.11+, Ollama (for Ollama integration), Docker (for HuggingFace model quantization).
  • Setup: minimal for basic use; installing and configuring Ollama may add time.
  • Docs: https://github.com/paulrobello/parllama
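The quick-start steps above can be sketched as a short shell session. The two install commands come from the summary; launching the TUI with a bare parllama command is an assumption based on the package name, not something this summary confirms:

```shell
# Install as an isolated tool (either option works):
uv tool install parllama
# or:
pipx install parllama

# Launch the TUI (command name assumed to match the package name):
parllama
```

For a development install, the summary notes make setup instead, which requires both uv and make on the PATH.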

Highlighted Details

  • Supports chat with vision-capable LLMs using slash commands like /add.image.
  • Extensive customization via command-line arguments and environment variables for themes, data directories, and provider URLs.
  • Features a custom prompt library, with import support from the Fabric project.
  • Includes session management, allowing users to save, recall, and compare conversations across different models and configurations.

Maintenance & Community

The project is actively maintained with frequent updates, as evidenced by the detailed changelog. Community interaction points are not explicitly listed in the README, but contribution guidelines are provided.

Licensing & Compatibility

The README does not explicitly state a license. Compatibility is noted for Windows 11 x64, Windows WSL x64, Mac OSX (Intel and Silicon), and Linux.

Limitations & Caveats

Docker is required only for HuggingFace model quantization. Token counting may not always be fully accurate, and some remote Ollama API features, such as CPU/GPU usage percentages, are unavailable.

Health Check

  • Last commit: 1 week ago
  • Responsiveness: 1 day
  • Pull Requests (30d): 1
  • Issues (30d): 2
  • Star History: 39 stars in the last 90 days

Explore Similar Projects

Starred by Chip Huyen (author of AI Engineering and Designing Machine Learning Systems):

  • LangBot by langbot-app: IM bot platform for the LLM era. 13k stars, top 0.9%; created 2 years ago, updated 5 days ago.