ChatLLM by yuanjie-ai

SDK for LLM-powered applications, compatible with OpenAI & LangChain

Created 2 years ago · 446 stars · Top 68.4% on sourcepulse

Project Summary

ChatLLM is a Python library designed to simplify the use of Large Language Models (LLMs), particularly for Chinese users. It provides a unified interface to various domestic LLMs (like ChatGLM, Wenxin Yiyan, Spark, Hunyuan) and supports OpenAI-compatible API endpoints, making it easy to integrate with existing ecosystems and tools like LangChain. The project aims to lower the barrier to entry for LLM experimentation and application development.

How It Works

The library leverages a modular design, allowing users to load and interact with different LLMs through a consistent API. It supports RAG (Retrieval-Augmented Generation) for knowledge base integration, enabling LLMs to answer questions based on provided documents (PDF, DOCX, TXT, MD). For OpenAI compatibility, it can run a local server that mimics the OpenAI API, allowing standard OpenAI SDKs and clients to connect to local or supported LLMs.
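Once that compatibility server is running, any standard OpenAI client can talk to it. Below is a minimal sketch using the official openai Python SDK; the base URL, API key, and model name are placeholder assumptions rather than values taken from the ChatLLM docs.

    from openai import OpenAI

    # Point the standard OpenAI client at the local ChatLLM-compatible server.
    # The base URL, key, and model name here are illustrative assumptions.
    client = OpenAI(
        base_url="http://localhost:8000/v1",
        api_key="EMPTY",  # local compatibility servers typically accept a placeholder key
    )

    response = client.chat.completions.create(
        model="chatglm-6b",  # hypothetical model id; use whatever the server actually exposes
        messages=[{"role": "user", "content": "Summarize what retrieval-augmented generation does."}],
    )
    print(response.choices[0].message.content)

Because the endpoint mimics the OpenAI API, the same pattern works from curl, LangChain, or any other OpenAI-compatible tooling.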

Quick Start & Requirements

  • Install: pip install -U chatllm
  • For OpenAI API compatibility: pip install "chatllm[openai]"
  • For PDF support: pip install "chatllm[pdf]"
  • Dependencies: Python, the relevant LLM model weights (e.g., THUDM/chatglm-6b), and potentially a GPU for larger models (see the loading sketch after this list).
  • Docs: docs/INSTALL.md
  • Demo: api.chatllm.vip
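The ChatGLM-6B weights mentioned in the dependencies are published on Hugging Face and are downloaded on first use. The sketch below loads them directly with transformers, following the model card's standard pattern rather than chatllm's own wrapper, and assumes a CUDA GPU with enough free VRAM.

    from transformers import AutoTokenizer, AutoModel

    # Download (or reuse a cached copy of) the ChatGLM-6B weights from Hugging Face.
    # trust_remote_code is required because the repository ships custom modeling code.
    tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
    model = AutoModel.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True).half().cuda()
    model = model.eval()

    # The custom modeling code exposes a chat() helper for multi-turn dialogue.
    response, history = model.chat(tokenizer, "你好", history=[])
    print(response)

For tighter VRAM budgets, the quantized THUDM/chatglm-6b-int4 checkpoint follows the same loading pattern (see Limitations & Caveats below).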

Highlighted Details

  • Supports multiple Chinese LLMs (ChatGLM, Wenxin Yiyan, Spark, Hunyuan).
  • Provides an OpenAI-compatible API server for seamless integration with standard clients and frameworks such as LangChain (see the sketch after this list).
  • Includes RAG capabilities for document-based Q&A (ChatPDF).
  • Offers OCR functionality for text extraction from images (ChatOCR).
  • Features a web UI for ChatMind and ChatPDF applications.
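Since the server speaks the OpenAI wire format, LangChain can be pointed at it as if it were OpenAI itself. A minimal sketch using the langchain-openai package follows; the base URL and model name are placeholder assumptions, not values from the ChatLLM docs.

    from langchain_openai import ChatOpenAI

    # Reuse LangChain's OpenAI chat wrapper against the local ChatLLM endpoint.
    # base_url, api_key, and model are illustrative assumptions.
    llm = ChatOpenAI(
        base_url="http://localhost:8000/v1",
        api_key="EMPTY",
        model="chatglm-6b",
    )

    print(llm.invoke("Explain how retrieval-augmented generation works in one paragraph.").content)

The same approach applies to any other framework that lets you override the OpenAI base URL.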

Maintenance & Community

  • Community contact via WeChat (313303303) for token rewards.
  • Active development roadmap includes expanding LLM and embedding model support, enhancing RAG features, and improving API/UI capabilities.

Licensing & Compatibility

  • The primary license is not explicitly stated in the README, though usage examples and community discussions suggest it is intended for broad use. Compatibility with commercial or closed-source projects would require clarification of the licensing terms from the maintainers.

Limitations & Caveats

  • The project is actively under development with many features listed as "TODO" in the roadmap, indicating potential for breaking changes or incomplete functionality.
  • Hardware requirements for running LLMs locally, especially larger models like ChatGLM-6B, can be significant (e.g., 6GB+ VRAM for INT4 quantization).
  • The project relies on specific model providers and Hugging Face repositories for model weights.
Health Check

  • Last commit: 10 months ago
  • Responsiveness: 1 day
  • Pull Requests (30d): 0
  • Issues (30d): 0
  • Star history: 2 stars in the last 90 days
