Self-hosted AI interface with document chats, YouTube chats, and more
Top 74.0% on sourcepulse
Abbey is a self-hosted, configurable AI interface designed for individuals and teams to interact with various AI models. It offers features like document and YouTube chats, workspaces, and supports multiple LLM, TTS, and OCR providers, enabling private, customized AI workflows.
How It Works
Abbey utilizes a Docker-based architecture, orchestrating backend services, a frontend, and a database. Users configure their preferred AI models (LLMs, embedding, TTS, OCR) and integrations (search engines, authentication, storage) via .env and settings.yml files. This declarative configuration allows Abbey to dynamically integrate with a wide range of AI services, including OpenAI, Anthropic, Ollama, and various OpenAI-compatible APIs, providing flexibility and extensibility.
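As a rough sketch, the split between the two files might look like the following; every key name here is an illustrative assumption, not Abbey's actual schema, so consult the project's documentation for the real keys:

```yaml
# settings.yml (hypothetical layout): non-secret wiring of providers to roles
lms:
  - provider: openai            # hosted API
    model: gpt-4o
  - provider: ollama            # local, OpenAI-compatible endpoint
    model: llama3
tts:
  - provider: openai
    voice: alloy

# .env (shown here as comments): secrets the settings above depend on
# OPENAI_API_KEY=sk-...
# DB_PASSWORD=change-me
```

Keeping secrets in .env and the non-secret model wiring in settings.yml is a common pattern: the YAML file can be committed and shared across a team while API keys stay out of version control.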
Quick Start & Requirements
Requires Docker and Docker Compose. Run docker compose up (development) or docker compose -f docker-compose.prod.yml up --build (production). Secrets (API keys, the database password) go in .env; model and integration configuration goes in settings.yml.
Highlighted Details
Maintenance & Community
Licensing & Compatibility
Limitations & Caveats