maid by Mobile-Artificial-Intelligence

Cross-platform Flutter app for local/remote AI model interfacing

created 1 year ago
2,099 stars

Top 21.8% on sourcepulse

Project Summary

Maid is a free, open-source, cross-platform Flutter application designed for interacting with large language models. It supports local GGUF models via llama.cpp, and remote APIs from Ollama, Mistral, Google Gemini, and OpenAI. The app also facilitates character card interactions using SillyTavern formats and allows in-app model downloads from Hugging Face, targeting users who want a unified interface for diverse AI models on their devices.

How It Works

Maid leverages Flutter for its cross-platform UI, enabling a consistent experience across desktop and mobile operating systems. For local model inference, it integrates with llama.cpp, a highly optimized C++ inference engine known for its efficiency on consumer hardware. Remote interactions are handled through API clients for Ollama, Mistral, Gemini, and OpenAI. The use of SillyTavern character card formats allows for easy integration with existing character ecosystems.
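
To make the remote side concrete, here is a minimal sketch of the request shape an OpenAI-compatible chat backend accepts. This is a hypothetical illustration, not Maid's actual client code (which is Dart/Flutter); the model name and prompts are made up, and Ollama exposes the same shape on its OpenAI-compatible endpoint.

```python
import json

def build_chat_request(model: str, system_prompt: str, user_message: str) -> str:
    # Assemble an OpenAI-style chat completion payload as a JSON string.
    # Illustrative only: field names match the OpenAI chat completions
    # request format shared by compatible backends.
    payload = {
        "model": model,
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_message},
        ],
        "stream": True,  # chat UIs usually stream tokens as they arrive
    }
    return json.dumps(payload)

body = build_chat_request("mistral", "You are a helpful assistant.", "Hello!")
```

The same payload works against any of the supported remote backends by changing only the base URL and credentials, which is what makes a single unified client UI practical.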

Quick Start & Requirements

  • Installation: Clone the repository (git clone https://github.com/Mobile-Artificial-Intelligence/maid.git). Flutter is vendored as a submodule for reproducible builds; for day-to-day development, consider using a local Flutter install and deinitializing the submodule (git submodule deinit -f packages/flutter).
  • Dependencies:
    • Fedora: sudo dnf install -y cmake ninja-build pkg-config gtk3-devel vulkan-devel
    • Debian: sudo apt-get install -y cmake ninja-build pkg-config libgtk-3-dev libvulkan-dev
  • Platform Support: Windows, macOS, Linux, Android. iOS releases are not yet available.
  • Resources: Tested with various models, including Calypso 3B, Orca Mini 3B, TinyLlama 1.1B, Phi-3, Mistral 7B, Mixtral 8x7B, Llama 2 7B-Chat, and Llama 7B.

Highlighted Details

  • Cross-platform support for Windows, macOS, Linux, and Android.
  • Local inference via llama.cpp and remote API support for Ollama, Mistral, Gemini, and OpenAI.
  • Compatibility with SillyTavern character card formats.
  • In-app model downloading from Hugging Face.
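
Character card support means Maid can load the same JSON card files used elsewhere in the SillyTavern ecosystem. The sketch below shows a minimal Character Card V2 payload; the field names follow the public chara_card_v2 spec, but the character itself is a made-up example, not data from the Maid codebase.

```python
import json

# Minimal SillyTavern-style "Character Card V2" payload. Top-level "spec"
# and "spec_version" identify the format; character fields live under "data".
card_json = json.dumps({
    "spec": "chara_card_v2",
    "spec_version": "2.0",
    "data": {
        "name": "Example Bot",
        "description": "A friendly demo character.",
        "personality": "curious, concise",
        "first_mes": "Hi! What shall we talk about?",
    },
})

card = json.loads(card_json)
greeting = card["data"]["first_mes"]  # shown as the character's opening message
```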

Maintenance & Community

The project is actively maintained, with CI/CD pipelines for Android, iOS, Linux, macOS, Windows, and Web builds. Key related projects include ggerganov/llama.cpp and davidmigloz/langchain_dart.

Licensing & Compatibility

Licensed under the MIT License. This permissive license allows for commercial use and integration into closed-source projects.

Limitations & Caveats

iOS releases are not currently available. The project is distributed without warranty.

Health Check

  • Last commit: 6 days ago
  • Responsiveness: 1 day
  • Pull Requests (30d): 1
  • Issues (30d): 1
  • Star History: 207 stars in the last 90 days

Explore Similar Projects

chatbox by chatboxai
  • Desktop client app for AI models/LLMs
  • Top 0.3% on sourcepulse, 36k stars, created 2 years ago, updated 6 days ago
  • Starred by Addy Osmani (Engineering Leader on Google Chrome), Victor Taelin (Author of Bend, Kind, HVM), and 1 more.

SillyTavern by SillyTavern
  • LLM frontend for power users
  • Top 3.2% on sourcepulse, 17k stars, created 2 years ago, updated 3 days ago
  • Starred by Chip Huyen (Author of AI Engineering, Designing Machine Learning Systems), Pietro Schirano (Founder of MagicPath), and 1 more.