GongBU by Bolin97

Local LLM fine-tuning platform for domain-specific adaptation

Created 1 year ago
693 stars

Top 49.1% on SourcePulse

Project Summary

GongBU is a no-code, web-based platform for local fine-tuning of large language models, designed for domain-specific adaptation. It targets non-technical users, enabling them to fine-tune, evaluate, and deploy models directly through a browser interface, simplifying LLM customization.

How It Works

Built on libraries such as Transformers and PEFT, GongBU abstracts the complexities of LLM fine-tuning into an intuitive, browser-driven workflow, so users can adapt models without writing any code. The platform manages dataset uploads, model downloads, and the fine-tuning process itself, offering a streamlined, end-to-end path to domain-specific LLM deployment.
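Under the hood, PEFT's most widely used method, LoRA, freezes the base model's weights and trains only a small low-rank update. The following dependency-free sketch illustrates that core idea (it is an illustration of the LoRA math, not GongBU's actual code):

```python
# Minimal sketch of the LoRA idea implemented by PEFT: instead of updating
# a full weight matrix W (d_out x d_in), train two small matrices
# B (d_out x r) and A (r x d_in) and compute with W + (alpha / r) * B @ A.
from typing import List

Matrix = List[List[float]]

def matmul(x: Matrix, y: Matrix) -> Matrix:
    """Plain nested-list matrix multiplication."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*y)] for row in x]

def lora_forward(w: Matrix, a: Matrix, b: Matrix, x: Matrix,
                 alpha: float, r: int) -> Matrix:
    """Apply the effective weight W + (alpha / r) * B @ A to input x."""
    delta = matmul(b, a)                      # low-rank update, d_out x d_in
    scale = alpha / r
    w_eff = [[wij + scale * dij for wij, dij in zip(wr, dr)]
             for wr, dr in zip(w, delta)]
    return matmul(w_eff, x)

# Frozen 2x2 base weight, rank-1 adapters, and a column-vector input.
w = [[1.0, 0.0], [0.0, 1.0]]
b = [[1.0], [0.0]]   # d_out x r, with r = 1
a = [[0.0, 2.0]]     # r x d_in
x = [[3.0], [4.0]]
print(lora_forward(w, a, b, x, alpha=1.0, r=1))  # → [[11.0], [4.0]]
```

Because only A and B are trained, the number of trainable parameters drops from d_out × d_in to r × (d_out + d_in), which is what makes local fine-tuning on consumer GPUs practical.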

Quick Start & Requirements

The recommended installation uses Docker Compose on a Linux system with NVIDIA Docker. Users clone the repository, run download.py (which requires inquirer and rich) to fetch files such as micromamba and a BERT model, install Docker and NVIDIA Docker, and then run docker compose -f docker-compose.prod.yaml up. The platform is then accessible at localhost:5173/home. Native installation is also possible with the correct configuration.
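The steps above can be sketched as the following transcript. Note that the repository URL is inferred from the author's name and should be verified against the project page, and that a Linux host with an NVIDIA GPU is assumed:

```shell
# Assumes Docker and the NVIDIA Container Toolkit are already installed.
git clone https://github.com/Bolin97/GongBU.git   # URL inferred from author name
cd GongBU

# download.py fetches required files such as micromamba and a BERT model;
# it depends on the inquirer and rich Python packages.
pip install inquirer rich
python download.py

# Build and start the production stack, then open http://localhost:5173/home
docker compose -f docker-compose.prod.yaml up
```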

Highlighted Details

  • Accompanied by a paper accepted at CIKM 2024.
  • Utilizes bun for enhanced JavaScript bundling and runtime performance.
  • Under active development, with potential for breaking changes, particularly in the frontend.

Maintenance & Community

The platform is under continuous development and polishing. The provided README does not detail specific community channels (e.g., Discord, Slack), roadmap links, or notable contributors.

Licensing & Compatibility

The project is distributed under the MIT License. While generally permissive, the README does not explicitly state compatibility notes for commercial use or closed-source linking.

Limitations & Caveats

The platform transmits sensitive data (username, password, dataset) in plain text between client and server, necessitating manual SSL configuration for secure operation in open networks. Ongoing development means breaking changes are possible. The developer disclaims responsibility for any loss or damage, as per the MIT license.

Health Check
Last Commit

1 month ago

Responsiveness

Inactive

Pull Requests (30d)
0
Issues (30d)
0
Star History
624 stars in the last 30 days

Explore Similar Projects

Starred by Andrej Karpathy (Founder of Eureka Labs; formerly at Tesla, OpenAI; author of CS 231n), Gabriel Almeida (Cofounder of Langflow), and 2 more.

torchchat by pytorch

0%
4k
PyTorch-native SDK for local LLM inference across diverse platforms
Created 1 year ago
Updated 5 months ago
Starred by Andrej Karpathy (Founder of Eureka Labs; formerly at Tesla, OpenAI; author of CS 231n), Stefan van der Walt (Core contributor to the scientific Python ecosystem), and 12 more.

litgpt by Lightning-AI

0.1%
13k
LLM SDK for pretraining, finetuning, and deploying 20+ high-performance LLMs
Created 2 years ago
Updated 4 days ago