Vali-98/ChatterUI: Mobile app frontend for LLMs
Top 22.8% on SourcePulse
ChatterUI is a native mobile application designed for interacting with Large Language Models (LLMs). It offers a user-friendly interface for both on-device inference using llama.cpp and connecting to various commercial and open-source LLM APIs, catering to mobile users seeking flexible LLM access and chat customization.
How It Works
ChatterUI leverages llama.cpp for on-device LLM execution, utilizing a custom React Native adapter (cui-llama.rn) for integration. This approach allows users to run quantized GGUF models directly on their mobile devices, offering a private and efficient LLM experience. For remote access, it supports numerous APIs including OpenAI, Claude, Cohere, Ollama, and text-generation-webui, with a flexible template system for custom API integrations.
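For illustration, here is a minimal sketch of what on-device inference could look like from a React Native caller. It assumes cui-llama.rn mirrors the upstream llama.rn interface (initLlama, completion, release); the model path, sampling options, and exact names are assumptions for this sketch, not ChatterUI's actual code.

```typescript
// Hypothetical sketch: on-device GGUF inference from React Native.
// Assumes cui-llama.rn exposes an llama.rn-style API; details may differ.
import { initLlama } from 'cui-llama.rn'

async function runLocalCompletion(modelPath: string, prompt: string) {
  // Load a quantized GGUF model stored on the device.
  const context = await initLlama({
    model: modelPath,   // e.g. a file:// path to a .gguf file
    n_ctx: 2048,        // context window size
    n_gpu_layers: 0,    // CPU-only; raise if GPU offload is available
  })

  // Stream tokens as they are generated via the partial-result callback.
  const { text } = await context.completion(
    { prompt, n_predict: 128, temperature: 0.7, stop: ['</s>'] },
    (data) => console.log('token:', data.token),
  )

  await context.release() // free native llama.cpp resources
  return text
}
```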
Quick Start & Requirements
Highlighted Details
On-device inference via llama.cpp with GGUF models.
Maintenance & Community
Built on llama.cpp and cui-llama.rn.
Licensing & Compatibility
Limitations & Caveats
iOS support is currently unavailable due to a lack of development hardware. Building for Android requires a Linux environment for EAS builds.