Large model data assistant for streamlined app development
Top 41.1% on sourcepulse
This project provides a lightweight, full-stack framework for building large language model (LLM) applications, aimed at developers and researchers who need a rapid deployment path. It integrates multiple LLM technologies, data visualization, and retrieval-augmented generation (RAG) for versatile data interaction and general knowledge Q&A.
How It Works
The framework is built on Dify for orchestration, Ollama/vLLM for local model serving (supporting models such as Qwen2.5 and DeepSeek), and Sanic for the backend API, with a Vue3/TypeScript/Vite frontend providing a modern UI. Key features include Text2SQL with chart rendering via ECharts, direct CSV file analysis, and RAG integration for broader knowledge retrieval.
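To make the Text2SQL-to-chart flow concrete, here is a minimal sketch. This is not the project's actual code: the function names, the assumption that the model wraps SQL in a fenced ```sql block, and the row format are all illustrative.

```python
import json
import re


def extract_sql(llm_output: str) -> str:
    """Pull the first SQL statement out of a model reply.

    Assumes (hypothetically) the model wraps SQL in a ```sql fence;
    falls back to treating the whole reply as SQL otherwise.
    """
    match = re.search(r"```sql\s*(.*?)```", llm_output, re.DOTALL | re.IGNORECASE)
    sql = match.group(1) if match else llm_output
    return sql.strip().rstrip(";")


def rows_to_echarts_bar(rows: list[dict], x_key: str, y_key: str) -> dict:
    """Shape query result rows into a minimal ECharts bar-chart option."""
    return {
        "xAxis": {"type": "category", "data": [r[x_key] for r in rows]},
        "yAxis": {"type": "value"},
        "series": [{"type": "bar", "data": [r[y_key] for r in rows]}],
    }


reply = "Here is the query:\n```sql\nSELECT region, SUM(sales) AS total FROM orders GROUP BY region;\n```"
sql = extract_sql(reply)
rows = [{"region": "North", "total": 120}, {"region": "South", "total": 95}]
option = rows_to_echarts_bar(rows, "region", "total")
print(sql)
print(json.dumps(option))
```

In the real application the extracted SQL would be executed against MySQL and the resulting option object handed to the ECharts renderer in the Vue3 frontend.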
Quick Start & Requirements
Run

docker-compose up -d

to bring up the full stack with Dify, Ollama, and the application services.
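For orientation, a compose file for such a stack might be outlined as below. This is a hypothetical sketch, not the project's actual configuration: service names, images, and ports are illustrative, and Dify is typically deployed from its own compose stack.

```yaml
services:
  ollama:
    image: ollama/ollama      # local model serving (Qwen2.5, DeepSeek, etc.)
    ports:
      - "11434:11434"
  app:
    build: .                  # Sanic backend API + built frontend assets
    ports:
      - "8000:8000"           # illustrative port
    depends_on:
      - ollama
```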
Maintenance & Community
The project is maintained by apconw. Community interaction is encouraged via WeChat groups for technical support and discussion.
Limitations & Caveats
The project requires significant setup across multiple dependencies (Dify, MySQL, Ollama, Node.js). While Dify offers one-click deployment, the other components require manual configuration and deployment. The README mentions paid one-on-one technical support, suggesting limited free support bandwidth.