Rust LLM ecosystem list
This repository curates Rust tools, libraries, and frameworks for working with Large Language Models (LLMs), GPT, and AI. It serves as a comprehensive resource for developers and researchers seeking to leverage Rust's performance and safety for AI applications, offering a wide array of models, inference engines, and related utilities.
How It Works
The list categorizes Rust projects by their function within the LLM ecosystem. This includes libraries for running LLM inference (e.g., llm, rust-bert, rllama), frameworks for chaining LLM calls (llm-chain), and tools for specific tasks such as web browsing (browser-agent) or code generation (autorust). It also highlights vector databases (pgvecto.rs, qdrant) and memory solutions (indexify, motorhead) that are crucial for LLM applications.
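To make the categories concrete, the sketch below computes sentence embeddings locally with rust-bert's sentence-embeddings pipeline, the kind of output a vector database such as qdrant or pgvecto.rs would then store. It is a minimal sketch assuming rust-bert's remote model loading and a working libtorch installation; treat the exact model name and types as illustrative rather than prescribed by this list.

```rust
use rust_bert::pipelines::sentence_embeddings::{
    SentenceEmbeddingsBuilder, SentenceEmbeddingsModelType,
};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Download and load a small sentence-embedding model (requires libtorch via the tch crate).
    let model = SentenceEmbeddingsBuilder::remote(SentenceEmbeddingsModelType::AllMiniLmL12V2)
        .create_model()?;

    // Embed sentences locally, with no calls to a hosted API.
    let embeddings = model.encode(&[
        "Rust offers predictable performance for inference workloads.",
        "These vectors can be stored in a vector database for retrieval.",
    ])?;

    println!(
        "embedded {} sentences, dimension {}",
        embeddings.len(),
        embeddings[0].len()
    );
    Ok(())
}
```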
Quick Start & Requirements
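A curated list has no single install step; a reasonable starting point is to add one of the listed crates to a Cargo project and try it. The sketch below counts prompt tokens with tiktoken-rs; the dependency version is illustrative, and the cl100k_base / encode_with_special_tokens names are assumptions based on that crate's published API rather than anything this list prescribes.

```rust
// Cargo.toml (version is illustrative):
// [dependencies]
// tiktoken-rs = "0.5"

use tiktoken_rs::cl100k_base;

fn main() {
    // Load the cl100k_base BPE ranks used by several OpenAI chat models.
    let bpe = cl100k_base().expect("failed to load BPE ranks");

    // Count tokens before sending a prompt to a hosted LLM API.
    let tokens = bpe.encode_with_special_tokens("How many tokens does this prompt use?");
    println!("token count: {}", tokens.len());
}
```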
Highlighted Details
rllama: Pure Rust implementation of LLaMA inference, suitable for embedding in other applications.
pgvecto.rs: Postgres extension for vector storage, claiming a 20x speedup over pgvector.
tiktoken-rs: Rust implementation of OpenAI's BPE tokenizer, built on tiktoken's Rust core.
rust-bert: Rust port of Hugging Face's transformers library, usable for local embeddings.
Maintenance & Community
The list was last updated about a year ago and is currently marked inactive.
Licensing & Compatibility
Limitations & Caveats
The list is a curated collection, not a single integrated framework. Users must evaluate and integrate individual projects, and some projects may be experimental or have limited documentation.