Rust desktop app for local/cloud LLM access and AI agent integration
Top 88.0% on sourcepulse
Moly is a desktop GUI client for local and cloud LLMs, built in pure Rust. It targets developers and power users who want a unified interface across AI providers, including local models served by Moly Server and AI agents built with the MoFa framework. Pre-built releases for macOS, Linux, and Windows give it a cross-platform footprint and keep local LLM interaction and agent integration straightforward.
How It Works
Moly leverages the Makepad UI toolkit for its cross-platform graphical interface and Project Robius for application development. It supports OpenAI-compatible APIs and integrates with Moly Server, a local LLM runner, and MoFa, a framework for building AI agents exposed via Dora servers. This architecture allows for flexible integration of diverse AI backends within a single, native application.
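Because the backends speak OpenAI-compatible APIs, any client that can issue a standard chat-completions request can exercise the same kind of server Moly connects to. The sketch below is illustrative only: the URL, model id, and API-key handling are placeholder assumptions rather than Moly's actual configuration, and it assumes the reqwest crate (with the "blocking" and "json" features) plus serde_json.

```rust
// Minimal sketch of an OpenAI-compatible chat-completions request.
// All endpoint details here are placeholders, not Moly's configuration.
use serde_json::json;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Any OpenAI-compatible server works: a cloud provider or a local runner
    // exposing /v1/chat/completions. This URL is an assumption for the example.
    let base_url = "http://localhost:8080/v1";
    let api_key = std::env::var("API_KEY").unwrap_or_default();

    let body = json!({
        "model": "example-model", // placeholder model id
        "messages": [
            {"role": "user", "content": "Say hello in one sentence."}
        ]
    });

    let client = reqwest::blocking::Client::new();
    let resp = client
        .post(format!("{base_url}/chat/completions"))
        .bearer_auth(api_key)
        .json(&body)
        .send()?
        .error_for_status()?;

    // Print the raw JSON; a real client would deserialize choices[0].message.content.
    println!("{}", resp.text()?);
    Ok(())
}
```

Swapping base_url between a cloud provider and a locally running server is the only change a client like this needs, which is the flexibility the single-application design relies on.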
Quick Start & Requirements
Build and run from source with cargo run --release. Linux builds need openssl, clang, binfmt, Xcursor/X11, and asound/pulse. macOS and Windows may require specific build tools. Moly Server requires downloading an executable or compiling from source. MoFa integration requires Python 3.10+, Dora, and specific Python libraries.
Highlighted Details
Pre-built releases ship as .app, .dmg, .deb, AppImage, and .exe installers. Supported providers are defined in supported_providers.json.
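Since the provider list is plain JSON, it can be inspected without knowing its exact schema. The sketch below is a generic, hypothetical reader (no field names from the real file are assumed); it needs only the serde_json crate.

```rust
// Illustrative sketch: load a providers file and print whatever entries it holds.
// The actual schema of supported_providers.json is not assumed here.
use std::fs;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let raw = fs::read_to_string("supported_providers.json")?;
    let parsed: serde_json::Value = serde_json::from_str(&raw)?;

    match &parsed {
        // If the file is a JSON array, list one line per entry.
        serde_json::Value::Array(entries) => {
            for (i, entry) in entries.iter().enumerate() {
                println!("provider {i}: {entry}");
            }
        }
        // Otherwise pretty-print the structure for inspection.
        other => println!("{other:#}"),
    }
    Ok(())
}
```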
Maintenance & Community
The project is in early development. Community interaction channels are not explicitly listed in the README.
Licensing & Compatibility
The project appears to be licensed under the MIT License, which is generally permissive for commercial use.
Limitations & Caveats
Moly is in early development, with potential for bugs and unexpected behavior. The pacman package for Linux has not yet been tested. Specific macOS packaging steps require granting "App Management" permissions.