Mobile app frontend for LLMs
ChatterUI is a native mobile application for interacting with Large Language Models (LLMs). It offers a user-friendly interface for both on-device inference via llama.cpp and connections to a range of commercial and open-source LLM APIs, catering to mobile users who want flexible LLM access and chat customization.
How It Works
ChatterUI uses llama.cpp for on-device LLM execution, integrated through a custom React Native adapter (cui-llama.rn). This lets users run quantized GGUF models directly on their mobile devices for a private and efficient LLM experience. For remote access, it supports numerous APIs, including OpenAI, Claude, Cohere, Ollama, and text-generation-webui, with a flexible template system for custom API integrations.
Quick Start & Requirements
Highlighted Details
On-device inference via llama.cpp with GGUF models.
Maintenance & Community
Built on llama.cpp and the cui-llama.rn React Native adapter.
Licensing & Compatibility
Limitations & Caveats
iOS support is currently unavailable due to a lack of development hardware. Building for Android requires a Linux environment for EAS builds.