Open-source LLM frontend for local character chat
This project provides a local, open-source frontend for interacting with various Large Language Models (LLMs), aiming to be a desktop alternative to services like CharacterAI. It targets users who want a private, customizable chat experience with AI characters, and it supports multiple API backends along with features such as character card management and theme customization.
How It Works
The application is built with Electron and runs as a desktop app. It acts as a unified interface to multiple LLM APIs, including OpenAI, Anthropic, Mistral, and Together AI, as well as any OpenAI-compatible endpoint such as Groq or Ollama. Users configure their chosen provider, API key, and model in the application's settings to start chatting.
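As a rough illustration, the sketch below shows the kind of request such a frontend sends once a provider is configured. The ProviderConfig shape, the chat helper, and the model names are illustrative assumptions rather than the app's actual code; the one grounded detail is that Ollama exposes an OpenAI-compatible API at http://localhost:11434/v1 by default, so swapping the base URL is enough to move from a cloud provider to a local one.

```ts
// Minimal sketch (hypothetical names) of a request against an
// OpenAI-compatible chat completions endpoint. Runs on Node 18+ (global fetch).

type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

interface ProviderConfig {
  baseUrl: string; // e.g. "https://api.openai.com/v1" or "http://localhost:11434/v1"
  apiKey: string;  // required by cloud providers; Ollama ignores it
  model: string;   // e.g. "gpt-4o-mini" or "llama3"
}

async function chat(config: ProviderConfig, messages: ChatMessage[]): Promise<string> {
  const res = await fetch(`${config.baseUrl}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${config.apiKey}`,
    },
    body: JSON.stringify({ model: config.model, messages }),
  });
  if (!res.ok) throw new Error(`Provider returned ${res.status}`);
  const data = await res.json();
  // OpenAI-compatible responses put the reply in choices[0].message.content.
  return data.choices[0].message.content;
}

// Example: pointing the same code at a local Ollama instance.
const ollama: ProviderConfig = {
  baseUrl: "http://localhost:11434/v1",
  apiKey: "not-needed-locally",
  model: "llama3",
};

chat(ollama, [{ role: "user", content: "Stay in character and greet me." }])
  .then(console.log)
  .catch(console.error);
```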
Quick Start & Requirements
Highlighted Details
Maintenance & Community
The project is actively developed, with recent releases including Linux and macOS binaries and auto-updater functionality. A Discord server is available for community support, bug reporting, and feature requests.
Licensing & Compatibility
The project does not state a license in its README, so further investigation is needed before commercial use or integration into closed-source projects.
Limitations & Caveats
The application is in Alpha Early Access (v0.0.1™®). In-app theme editing is still a work in progress (WIP); customization currently requires manual CSS edits in the source code. Local inference is planned but not yet implemented.