SDK for LLM-powered applications, compatible with OpenAI & LangChain
Top 67.3% on SourcePulse
ChatLLM is a Python library designed to simplify the use of Large Language Models (LLMs), particularly for Chinese users. It provides a unified interface to a range of Chinese LLMs (such as ChatGLM, Wenxin Yiyan, Spark, and Hunyuan) and supports OpenAI-compatible API endpoints, making it easy to integrate with existing ecosystems and tools such as LangChain. The project aims to lower the barrier to entry for LLM experimentation and application development.
How It Works
The library leverages a modular design, allowing users to load and interact with different LLMs through a consistent API. It supports RAG (Retrieval-Augmented Generation) for knowledge base integration, enabling LLMs to answer questions based on provided documents (PDF, DOCX, TXT, MD). For OpenAI compatibility, it can run a local server that mimics the OpenAI API, allowing standard OpenAI SDKs and clients to connect to local or supported LLMs.
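Because the local server speaks the OpenAI wire protocol, the standard openai Python SDK can be pointed at it directly. The sketch below assumes the server is already running; the base URL, port, and model name are illustrative guesses rather than documented ChatLLM defaults.

```python
# Minimal sketch: querying a locally served, OpenAI-compatible endpoint
# with the official openai SDK. The base_url, api_key handling, and model
# name are assumptions for illustration -- check the ChatLLM docs for the
# values your server actually exposes.
from openai import OpenAI

client = OpenAI(
    base_url="http://127.0.0.1:8000/v1",   # assumed address of the local server
    api_key="not-needed-locally",          # placeholder; a local server may not validate keys
)

response = client.chat.completions.create(
    model="chatglm",  # hypothetical model identifier
    messages=[{"role": "user", "content": "Introduce yourself in one sentence."}],
)
print(response.choices[0].message.content)
```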
Quick Start & Requirements
pip install -U chatllm
pip install "chatllm[openai]"
pip install "chatllm[pdf]"
Highlighted Details
Maintenance & Community
Licensing & Compatibility
Limitations & Caveats
Last activity: 1 year ago
Status: Inactive