hands-on-llms by iusztinpaul

LLM course for designing/deploying a real-time financial advisor

Created 2 years ago
3,391 stars

Top 14.3% on SourcePulse

Project Summary

This repository provides a free, hands-on course for learning about Large Language Models (LLMs), LLMOps, and vector databases. It guides users through designing, training, and deploying a real-time financial advisor LLM system, targeting engineers and researchers interested in practical GenAI implementation.

How It Works

The course follows a three-pipeline architecture. The training pipeline fine-tunes an LLM with QLoRA on a proprietary dataset, logs experiments to Comet ML, and deploys the model on Beam. The streaming pipeline ingests real-time financial news from Alpaca, embeds the documents inside a Bytewax dataflow, and stores the embeddings in a Qdrant vector database. The inference pipeline uses LangChain to retrieve relevant context from the vector DB, enrich the prompt, and serve financial advice through a Gradio UI, also deployed on Beam. A minimal sketch of the retrieval step is shown below.
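To make the inference flow concrete, here is a minimal retrieval-augmented generation sketch: embed the user's question, pull the most similar news chunks from Qdrant, and assemble an enriched prompt. The collection name `financial_news`, the embedding model, and the prompt wording are illustrative assumptions, not the course's actual code.

```python
# Hypothetical RAG sketch of the inference step (names and prompt are assumptions).
from qdrant_client import QdrantClient
from sentence_transformers import SentenceTransformer

COLLECTION = "financial_news"  # assumed collection name populated by the streaming pipeline

embedder = SentenceTransformer("all-MiniLM-L6-v2")  # assumed embedding model
qdrant = QdrantClient(url="http://localhost:6333")


def build_prompt(question: str, top_k: int = 3) -> str:
    """Embed the question, fetch the top-k similar news chunks, and build an enriched prompt."""
    query_vector = embedder.encode(question).tolist()
    hits = qdrant.search(
        collection_name=COLLECTION,
        query_vector=query_vector,
        limit=top_k,
    )
    context = "\n".join((hit.payload or {}).get("text", "") for hit in hits)
    return (
        "You are a financial advisor. Use the news context below to answer.\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )


if __name__ == "__main__":
    prompt = build_prompt("Should I worry about rising interest rates?")
    print(prompt)  # In the full pipeline, this prompt is sent to the fine-tuned LLM.
```

The real course wires this retrieval step into a LangChain chain and serves the answer through the Gradio UI on Beam; the sketch only shows the vector search and prompt-enrichment logic.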

Quick Start & Requirements

Health Check
Last Commit: 1 year ago
Responsiveness: Inactive
Pull Requests (30d): 0
Issues (30d): 0
Star History: 9 stars in the last 30 days

