Agentic app examples built on Llama Stack
This repository provides example applications built on the Llama Stack, enabling developers to create agentic AI applications. It targets developers building generative AI applications who need multi-step reasoning, tool usage, and safety features, offering a standardized way to integrate Llama models, Llama Guard, and tool execution environments.
How It Works
The Llama Stack defines standardized APIs for core generative AI application components, including model inference, safety checks (via Llama Guard), and tool execution. Applications interact with a Llama Stack server, which orchestrates these components. This approach simplifies development by providing a unified distribution that bundles necessary functionalities, allowing agents to break down tasks, use tools (like code interpreters or search engines), and leverage built-in safety mechanisms.
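As a rough illustration of the pattern above, the sketch below builds a chat-completion request for a locally running Llama Stack server. The endpoint path, payload field names, and model id are assumptions based on the Llama Stack REST API and may differ across server versions; this is a minimal sketch, not the project's own client code.

```python
import json
from urllib import request

def build_chat_request(base_url, model_id, user_message):
    # Payload shape assumed from the Llama Stack inference API;
    # adjust field names to match your server version.
    payload = {
        "model_id": model_id,
        "messages": [{"role": "user", "content": user_message}],
    }
    return request.Request(
        f"{base_url}/v1/inference/chat-completion",  # assumed endpoint path
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Model id is a placeholder; use one served by your stack.
req = build_chat_request(
    "http://localhost:8321",
    "meta-llama/Llama-3.2-3B-Instruct",
    "Hello!",
)
# response = request.urlopen(req)  # requires a running Llama Stack server
```

In practice most applications would use the official Llama Stack client library rather than raw HTTP, but the request/response flow against the server is the same.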
Quick Start & Requirements
Create and activate a Conda environment (conda create -n stack python=3.10, then conda activate stack) and install dependencies (pip install -r requirements.txt). API keys for Tavily Search (TAVILY_SEARCH_API_KEY), WolframAlpha (WOLFRAM_ALPHA_API_KEY), and Brave Search are optional and only needed for specific tools. Start a Llama Stack server by following the llama-stack repository's guide; the server typically listens on http://localhost:8321. Run an example agent with python -m examples.agents.hello localhost 8321 or python -m examples.agents.rag_with_vector_db localhost 8321.
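The setup steps above can be collected into one session sketch; the environment name, API-key placeholders, and server address follow this README, and assume a Llama Stack server is already running:

```shell
# Create and activate the environment, then install dependencies
conda create -n stack python=3.10
conda activate stack
pip install -r requirements.txt

# Optional: keys for specific tools (placeholders, supply your own)
export TAVILY_SEARCH_API_KEY=...
export WOLFRAM_ALPHA_API_KEY=...

# With a Llama Stack server listening on localhost:8321,
# run one of the example agents
python -m examples.agents.hello localhost 8321
```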
Highlighted Details
Maintenance & Community
The project is from Meta AI. Further community and roadmap details are not explicitly provided in this README.
Licensing & Compatibility
The README does not specify a license for llama-stack-apps. The underlying Llama models have their own licenses.
Limitations & Caveats
Distribution installation requires Conda; a venv is enough to run the example apps but not for the initial setup. Specific tool integrations may require external API keys.