learn-langchain by paolorechia

AI agent playground using LangChain with Llama-based models

created 2 years ago
275 stars

Top 94.9% on sourcepulse

Project Summary

This repository provides a playground for building AI agents with LangChain and Vicuna, a Llama-based LLM. It focuses on implementing zero-shot and few-shot prompts via the ReAct framework, letting users experiment with conversational AI and task execution.
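A zero-shot ReAct prompt of the kind the project experiments with typically lists the available tools and spells out the Thought/Action/Observation format. The sketch below assembles such a prompt; the tool names, descriptions, and wording are illustrative, not the repository's actual template:

```python
# Hypothetical tool registry; names and descriptions are illustrative.
TOOLS = {
    "Python REPL": "Executes Python code and returns printed output.",
    "Search": "Queries a web search engine and returns snippets.",
}

def build_react_prompt(question: str) -> str:
    """Assemble a zero-shot ReAct-style prompt for the given question."""
    tool_lines = "\n".join(f"{name}: {desc}" for name, desc in TOOLS.items())
    return (
        "Answer the question using the following tools:\n"
        f"{tool_lines}\n\n"
        "Use this format:\n"
        "Thought: reason about what to do\n"
        "Action: one of [" + ", ".join(TOOLS) + "]\n"
        "Action Input: the input to the action\n"
        "Observation: the result of the action\n"
        "... (repeat Thought/Action/Observation as needed)\n"
        "Final Answer: the answer to the question\n\n"
        f"Question: {question}\n"
    )
```

A few-shot variant would simply append one or two worked Thought/Action/Observation traces before the final question.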

How It Works

The project leverages the ReAct (Reasoning and Acting) framework, which combines language models with external tools. Agents reason about a task, decide on an action (e.g., running code in a Python REPL or querying a search engine), execute it, and then use the resulting observation to refine their next step. It supports various Vicuna models, including quantized versions (4-bit GPTQ), and offers two backend options: oobabooga's Text Generation WebUI or a custom server with prompt logging.
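The reason/act/observe cycle described above can be sketched as a small loop. This is a minimal illustration with a stubbed model and tool, not the repository's implementation; in the real project the completion would come from a Vicuna backend and the tools would be the actual REPL and search integrations:

```python
import re

# Stubbed model so the loop runs standalone; a real setup would send
# the growing prompt to a Vicuna backend and append its completion.
def fake_llm(prompt: str) -> str:
    if "Observation:" in prompt:
        return "Thought: I have the answer.\nFinal Answer: Berlin"
    return ("Thought: I should look this up.\n"
            "Action: search\nAction Input: capital of Germany")

# Stubbed tool registry; a real agent would dispatch to live tools here.
TOOLS = {"search": lambda query: "Berlin is the capital of Germany."}

def react_loop(question: str, max_steps: int = 5) -> str:
    """Run the ReAct cycle: think, act, observe, repeat until an answer."""
    prompt = f"Question: {question}\n"
    for _ in range(max_steps):
        output = fake_llm(prompt)
        prompt += output + "\n"
        if "Final Answer:" in output:
            return output.split("Final Answer:", 1)[1].strip()
        match = re.search(r"Action: (\w+)\nAction Input: (.*)", output)
        if match:  # run the chosen tool and feed the result back
            observation = TOOLS[match.group(1)](match.group(2))
            prompt += f"Observation: {observation}\n"
    return "no answer within step budget"
```

The step budget matters in practice: with smaller models the parse can fail or the loop can wander, which is one reason the README flags coding prompts as unreliable.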

Quick Start & Requirements

  • Installation: Execute chmod +x ./install_on_virtualenv_and_pip.sh && ./install_on_virtualenv_and_pip.sh or manually run pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu118 and pip3 install -r requirements.txt.
  • Prerequisites: NVIDIA Driver/Toolkit (CUDA 11.8 recommended), Python, Git LFS for quantized models.
  • Setup: Requires installing dependencies and potentially downloading large model files.
  • Docs: Medium Articles

Highlighted Details

  • Supports Vicuna 7b/13b models, including 4-bit GPTQ quantized versions.
  • Integrates with oobabooga's Text Generation WebUI for a user-friendly backend.
  • Demonstrates agent execution with Python REPL, web search, and Matplotlib chart generation.
  • Includes examples for specific tasks like generating cat jokes and answering questions about Germany.
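At its core, a Python REPL tool like the one demonstrated above only needs to execute model-generated code and capture stdout so the agent can feed the output back as an observation. The following is a hedged sketch of that idea, not the repository's actual tool; a production version would sandbox the code and handle errors:

```python
import contextlib
import io

def python_repl(code: str) -> str:
    """Execute model-generated Python and return its printed output.

    Deliberately minimal: exec runs in-process with fresh globals, with
    no sandboxing or error handling, which a real agent tool would need.
    """
    buffer = io.StringIO()
    with contextlib.redirect_stdout(buffer):
        exec(code, {})
    return buffer.getvalue().strip()
```

The chart-generation example works the same way: the model emits Matplotlib code as the Action Input, and the tool executes it for the side effect of saving a figure.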

Maintenance & Community

The project is maintained by paolorechia. Further community engagement details are not explicitly provided in the README.

Licensing & Compatibility

The repository itself does not specify a license. However, it utilizes and acknowledges other projects, including GPTQ-for-LLaMa and FastChat, which have their own licenses. Compatibility for commercial use or closed-source linking would depend on the licenses of these underlying components and the models used.

Limitations & Caveats

The README notes that coding prompts are currently unreliable and model-dependent. The project's custom web server option is not recommended due to open bugs. Windows installation instructions are basic, and some environment variable handling may require adaptation. The "Code Editor Tool / Code-it task executor" is experimental.

Health Check

  • Last commit: 2 years ago
  • Responsiveness: 1 day
  • Pull Requests (30d): 0
  • Issues (30d): 0
  • Star History: 0 stars in the last 90 days

