Floneum: Rust crates for local/remote AI model applications
Top 21.6% on SourcePulse
Floneum provides a Rust-based ecosystem for developing applications with local or remote AI models, targeting developers and researchers. It simplifies interaction with various AI modalities (text, audio, image) and offers a graphical editor for AI workflows, enabling efficient local AI application development.
How It Works
Floneum leverages the candle machine learning library for pure Rust model execution, supporting quantized and accelerated models. Its core, Kalosm, offers a unified interface for diverse models like Llama, Mistral, and Whisper. A key innovation is structured generation, allowing Rust types with #[derive(Parse, Schema)] to constrain model output to specific formats (JSON, regex patterns), enhancing data integrity and control.
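Kalosm implements structured generation at the sampler level through derived parsers; the following is a self-contained, hypothetical sketch of the underlying idea in plain Rust (no Kalosm APIs, and the format and function names are illustrative): a candidate token is accepted only if appending it to the generated prefix can still match the target format.

```rust
// Hypothetical illustration of constrained decoding, not Kalosm's API.
// The "schema" here is a fixed JSON-style target; real structured generation
// derives the validity check from a Rust type's parser.
fn still_valid(prefix: &str) -> bool {
    let target = "{\"done\": true}";
    target.starts_with(prefix)
}

// Simulate the sampling step: pick the first model-proposed token that
// keeps the output on a path toward the required format.
fn constrained_pick<'a>(prefix: &str, candidates: &[&'a str]) -> Option<&'a str> {
    candidates
        .iter()
        .copied()
        .find(|tok| still_valid(&format!("{}{}", prefix, tok)))
}

fn main() {
    // The model proposes three continuations; only "tru" can still
    // extend to the valid output {"done": true}.
    let picked = constrained_pick("{\"done\": ", &["maybe", "tru", "false"]);
    println!("{:?}", picked); // Some("tru")
}
```

Because invalid tokens are rejected before they are emitted, the final output is guaranteed to parse into the target type, which is what makes `#[derive(Parse, Schema)]` useful for extracting typed data from model output.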
Quick Start & Requirements
cargo new floneum-app && cd floneum-app
cargo add kalosm --features language (add metal, cuda, or mkl for acceleration)
cargo add tokio --features full
cargo run --release

Highlighted Details
llama.cpp on Metal (M2): 39 t/s vs 27 t/s for Mistral 7b.

Maintenance & Community
Licensing & Compatibility
Limitations & Caveats