Web UI for local ChatGLM deployment
This project provides a self-hostable web interface for the ChatGLM language model, aiming to replicate the user experience of ChatGPT. It's designed for users who want to run a powerful conversational AI locally, offering offline capabilities and the flexibility to use custom-tuned GLM models.
How It Works
The project uses FastAPI for the backend API and Vue 3 for the frontend. It supports streaming output from ChatGLM models and lets users adjust generation parameters, manage conversation history, and save outputs as images. The codebase is forked from existing popular ChatGPT web UIs and adds features such as knowledge base Q&A, though some advanced features from the upstream repositories are still under development.
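Conceptually, streaming output means the backend yields tokens to the client as the model produces them, rather than waiting for the full completion. A minimal pure-Python sketch of that flow (the function names here are hypothetical illustrations, not the project's actual API):

```python
# Sketch of token streaming, assuming a hypothetical generate() that
# yields tokens one at a time (the real project streams from ChatGLM
# through a FastAPI endpoint).
from typing import Iterator

def generate(prompt: str) -> Iterator[str]:
    # Stand-in for model inference: emit one token at a time.
    for token in ["Hello", ",", " ", "world", "!"]:
        yield token

def stream_response(prompt: str) -> Iterator[str]:
    # Each chunk can be flushed to the client as soon as the model
    # emits it, which produces the "typewriter" effect in the UI.
    for token in generate(prompt):
        yield token

print("".join(stream_response("hi")))  # -> Hello, world!
```

In the real backend, `stream_response` would be wrapped in a streaming HTTP response so the frontend can render tokens incrementally.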
Quick Start & Requirements
- Backend: run `python main.py` (optional arguments select device, quantization, host, and port). Requires Python 3.8+; dependencies are listed in `requirements.txt`.
- Frontend: run `pnpm bootstrap`, then `pnpm dev`. Requires Node.js (v16 or v18 recommended) and `pnpm`.
- Knowledge base Q&A: run `python gen_data.py` before starting the API.
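The backend's optional arguments could be handled with standard `argparse`. A hedged sketch (the flag names below are assumptions for illustration; check `python main.py --help` for the real ones):

```python
# Hypothetical sketch of the CLI flags mentioned above (flag names and
# defaults are assumptions, not taken from the project).
import argparse

def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(description="ChatGLM web UI backend")
    parser.add_argument("--device", default="cuda",
                        help="inference device, e.g. cuda or cpu")
    parser.add_argument("--quantize", type=int, default=None,
                        help="quantization bits, e.g. 4 or 8")
    parser.add_argument("--host", default="127.0.0.1",
                        help="address the API server binds to")
    parser.add_argument("--port", type=int, default=8000,
                        help="port the API server listens on")
    return parser

args = build_parser().parse_args(["--device", "cpu", "--port", "8080"])
print(args.device, args.port)  # cpu 8080
```

Parsing the flags up front like this keeps deployment choices (device, quantization, bind address) out of the code itself.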
Maintenance & Community
The project is a fork of Chanzhaoyu/chatgpt-web and WenJing95/chatgpt-web. Contributions are guided by a contribution guide.
Licensing & Compatibility
MIT License. Permissive for commercial use and integration with closed-source applications.
Limitations & Caveats
Some features from parent repositories (e.g., permissions, prompt store, Langchain integration) are marked as "to be implemented." Docker deployment instructions are pending. The project relies on specific Node.js versions and may require manual configuration for certain frontend behaviors like typewriter effects when behind reverse proxies.