timmyy123: On-device AI assistant for mobile chat and generation
Top 92.6% on SourcePulse
LLM Hub is an open-source Android application for on-device LLM chat and image generation, prioritizing user privacy and offline functionality. It targets mobile users seeking to run powerful models locally, offering optimized CPU/GPU/NPU acceleration and a suite of integrated AI tools. An iOS version is planned, contingent on funding for necessary hardware and developer accounts.
How It Works
Built with Kotlin and Jetpack Compose, the app uses MediaPipe, LiteRT, and the Nexa SDK for LLM inference, and MNN/Qualcomm QNN for image generation. It supports multiple model formats (.task, .litertlm, .gguf, and .onnx) with INT4/INT8 quantization. All processing happens entirely on-device, so no data is collected and no internet connection is required.
Quick Start & Requirements
For Android, download from Google Play or build from source using Android Studio and Gradle (./gradlew assembleDebug). iOS development requires macOS with Xcode installed and an Apple ID. Clone the repository and open the Xcode project (ios/LLMHub/LLMHub.xcodeproj). Development setup may involve configuring a Hugging Face token (HF_TOKEN) and a debug premium flag (DEBUG_PREMIUM) in android/local.properties.
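The development-setup step above can be sketched as a shell snippet. This is a hedged sketch, not project documentation: the token value is a placeholder, and the key names (HF_TOKEN, DEBUG_PREMIUM) are taken from the description above.

```shell
# From the project root: create android/local.properties with the
# development settings mentioned above (placeholder Hugging Face token).
mkdir -p android
cat > android/local.properties <<'EOF'
HF_TOKEN=hf_xxxxxxxxxxxxxxxx
DEBUG_PREMIUM=true
EOF

# Then, inside the cloned repository, build the debug APK:
# ./gradlew assembleDebug
```

Note that local.properties is conventionally git-ignored in Android projects, so these values stay out of version control.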
Highlighted Details
The app bundles a suite of integrated tools: creAItor, an on-device coding environment (Vibe Coder), writing assistance, Stable Diffusion image generation, offline translation with OCR, on-device speech-to-text transcription, and a scam detection tool.
Maintenance & Community
The project is actively developed by a single author, seeking sponsorship for iOS development. Support and commercial licensing inquiries can be directed via email. Community interaction is facilitated through GitHub Issues and Discussions.
Licensing & Compatibility
The source code is licensed under the PolyForm Noncommercial License 1.0.0. This permits free use, study, and modification for non-commercial purposes. Commercial use, including distribution or monetization, requires explicit written permission from the author.
Limitations & Caveats
The iOS version is under development and requires funding for completion. GGUF model compatibility can be inconsistent across Android devices due to SDK dependencies, and NPU acceleration for GGUF is limited to high-end hardware. Certain Hugging Face models require manual license acceptance during local development builds.