LLM Ops course for building production RAG systems
This repository provides educational materials and code examples for learning Large Language Model Operations (LLM Ops). It targets Generative AI practitioners aiming to build, deploy, and operate production-ready LLM applications using frameworks like LangChain and LlamaIndex. The primary benefit is a structured, video-guided curriculum covering RAG systems, agents, fine-tuning, deployment, and observability.
How It Works
The course follows a modular approach, with each session focusing on specific LLM Ops concepts and tools. It demonstrates building end-to-end Retrieval Augmented Generation (RAG) systems, leveraging LangChain and LlamaIndex for data ingestion, querying, and agentic behavior. The curriculum emphasizes practical application, guiding users through deploying models like Llama 2 with FastAPI and integrating observability tools like Weights & Biases (WandB) and LangSmith.
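The end-to-end flow is easier to see in code. Below is a minimal RAG sketch in the spirit of the course, written against LangChain's current package layout; the package names, model name, and file path are illustrative assumptions rather than code from this repository, whose late-2023 notebooks likely use older LangChain imports.

```python
# Minimal RAG sketch: ingest a document, build a vector index, answer a question.
# Assumptions (not from this repository): langchain-openai, langchain-community,
# langchain-text-splitters, and faiss-cpu are installed, and OPENAI_API_KEY is set.
from langchain_community.document_loaders import TextLoader
from langchain_text_splitters import RecursiveCharacterTextSplitter
from langchain_openai import OpenAIEmbeddings, ChatOpenAI
from langchain_community.vectorstores import FAISS

# 1. Ingestion: load a local file (hypothetical path) and split it into chunks.
docs = TextLoader("data/notes.txt").load()
chunks = RecursiveCharacterTextSplitter(chunk_size=500, chunk_overlap=50).split_documents(docs)

# 2. Indexing: embed the chunks and store them in an in-memory FAISS index.
index = FAISS.from_documents(chunks, OpenAIEmbeddings())

# 3. Retrieval + generation: fetch the most relevant chunks and let the LLM answer.
retriever = index.as_retriever(search_kwargs={"k": 3})
llm = ChatOpenAI(model="gpt-4o-mini")  # assumed model; the course uses others

question = "What are the main ideas in these notes?"
context = "\n\n".join(d.page_content for d in retriever.invoke(question))
answer = llm.invoke(f"Answer using only this context:\n{context}\n\nQuestion: {question}")
print(answer.content)
```

The deployment sessions expose a model behind an HTTP API. The sketch below shows the general shape of such a FastAPI service; the endpoint name, request schema, and the generate() placeholder are hypothetical stand-ins for an actual Llama 2 call.

```python
# Hypothetical serving sketch, not course code. Run with:
#   uvicorn main:app --reload   (assuming this file is named main.py)
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Prompt(BaseModel):
    text: str

def generate(prompt: str) -> str:
    # Placeholder for a real Llama 2 call (e.g. via transformers or a hosted API).
    return f"echo: {prompt}"

@app.post("/generate")
def generate_endpoint(prompt: Prompt) -> dict:
    return {"completion": generate(prompt.text)}
```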
Quick Start & Requirements
Use pip install to set up the Python dependencies.
Highlighted Details
Maintenance & Community
Licensing & Compatibility
Limitations & Caveats
The course content and code are from late 2023 and may require updates for full functionality due to the rapid evolution of LLM technologies.
1 year ago
Inactive