rustformers: Rust ecosystem for LLM inference (unmaintained)
Top 8.3% on SourcePulse
This project provides an ecosystem of Rust libraries for working with large language models (LLMs), built on the GGML tensor library. It targets developers and end-users seeking efficient, Rust-native LLM inference, offering a CLI for direct interaction and a crate for programmatic use.
How It Works
The core of the project leverages the GGML tensor library, aiming to bring Rust's robustness and ease of use to LLM inference. It supports several GGML-compatible model architectures (such as LLaMA, GPT-2, GPT-J, GPT-NeoX, BLOOM, and MPT) and multiple quantization formats, with an initial focus on CPU inference; GPU acceleration (CUDA, Metal) was a planned feature.
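For programmatic use, the llm crate exposes a model loader plus a session-based inference API. The sketch below loosely follows the example shipped with the 0.1.x release of the crate; the API changed across branches, so the exact module paths, trait names, and struct fields shown here are assumptions to check against the version you actually depend on.

// Rough sketch of programmatic inference with the `llm` crate (0.1.x line).
// Assumed Cargo.toml dependencies: llm = "0.1", rand = "0.8".
use std::io::Write;
use llm::Model; // trait providing start_session() in the 0.1.x API (assumed name)

fn main() {
    // Load a GGML-format LLaMA model from disk.
    let model = llm::load::<llm::models::Llama>(
        std::path::Path::new("/path/to/model.bin"), // placeholder path to a GGML file
        Default::default(),                         // default model/loader parameters
        llm::load_progress_callback_stdout,         // print loading progress to stdout
    )
    .unwrap_or_else(|err| panic!("failed to load model: {err}"));

    // Start an inference session and stream generated tokens to stdout.
    let mut session = model.start_session(Default::default());
    let result = session.infer::<std::convert::Infallible>(
        &model,
        &mut rand::thread_rng(),
        &llm::InferenceRequest {
            prompt: "Rust is a cool programming language because",
            ..Default::default()
        },
        &mut Default::default(), // output request; logits/embeddings not captured here
        |token| {
            print!("{token}");
            std::io::stdout().flush().unwrap();
            Ok(())
        },
    );

    match result {
        Ok(stats) => println!("\n\n{stats}"), // per-run timing statistics
        Err(err) => eprintln!("\n{err}"),
    }
}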
Quick Start & Requirements
cargo install --git https://github.com/rustformers/llm llm-cli
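Once installed, the CLI can run inference against a local GGML model file. The subcommands and flags changed between releases, so the invocation below is only an illustration and may not match the installed version exactly; run llm --help to see the options available to you.

llm infer -a llama -m /path/to/model.bin -p "Tell me how cool the Rust programming language is:"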
Highlighted Details
Maintenance & Community
The repository is archived and marked inactive; its last activity was about a year ago.
Licensing & Compatibility
Limitations & Caveats
The project is archived and no longer actively maintained. The released version (0.1.1) is significantly out of date. The main and gguf branches are also outdated and do not support GGUF or the latest GGML versions. The develop branch, intended to sync with the latest GGML and support GGUF, was not completed.