KoboldAI-Client by KoboldAI

Browser-based front-end for AI-assisted writing with local & remote AI models

Created 4 years ago
3,782 stars

Top 12.8% on SourcePulse

Project Summary

KoboldAI is a browser-based front-end for AI-assisted writing, targeting writers, role-players, and chatbot enthusiasts. It provides a versatile platform for interacting with various local and remote AI models, offering features like memory, author's notes, and world info to enhance creative writing and interactive storytelling.

How It Works

KoboldAI acts as an interface to AI language models, allowing users to select and configure different models for various writing styles: Novel, Adventure, Chatbot, and Hybrid. It supports multiple input methods and output formatting, enabling users to tailor their experience from traditional novel writing to interactive text adventures or conversational AI. The project emphasizes flexibility, allowing users to run models on Google Colab, their own hardware (CPU/GPU), or through remote services.

Quick Start & Requirements

  • Local Installation: Download the latest offline installer for Windows, or clone the GitHub repository and run install_requirements.bat (Windows) or install_requirements.sh (Linux); a quick post-install connectivity check is sketched after this list.
  • Prerequisites: Python 3.x, CUDA for NVIDIA GPUs, ROCm for AMD GPUs (Linux only).
  • Google Colab: Easiest way to start, with TPU and GPU editions available.
  • Resources: Local installation requires significant disk space (20GB+ for the client, plus additional space for downloaded models).
  • Documentation: koboldai.org/cpp, koboldai.org/united
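
Once a local installation is up and the server has been started, a quick way to confirm the web service is reachable is to poll its HTTP port. Below is a minimal sketch, assuming the server listens on the default local address http://localhost:5000 (adjust the URL if your instance uses a different port):

```python
# Minimal sketch: confirm a locally started KoboldAI instance is reachable.
# Assumes the default local address http://localhost:5000; change URL if your
# instance listens elsewhere. Uses only the Python standard library.
import urllib.error
import urllib.request

URL = "http://localhost:5000"

try:
    with urllib.request.urlopen(URL, timeout=10) as resp:
        print(f"KoboldAI web UI responded with HTTP {resp.status}")
except urllib.error.URLError as err:
    print(f"No response from {URL}: {err}")
```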

Highlighted Details

  • Supports multiple model categories and writing styles: Novel, Adventure, Chatbot, and Hybrid.
  • Offers features like Memory, Author's Note, and World Info for enhanced writing control.
  • Provides an "Adventure Mode" mimicking AI Dungeon, with flexible character control.
  • Includes a REST API for programmatic interaction.
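
As an illustration of the last point, here is a minimal sketch of programmatic text generation against a locally running instance. It assumes the default port 5000 and the /api/v1/generate endpoint; the exact parameter names are assumptions and should be checked against your instance's built-in API documentation.

```python
# Minimal sketch: request a text continuation from a local KoboldAI server.
# Assumes http://localhost:5000 and the /api/v1/generate endpoint; payload
# fields such as "max_length" and "temperature" may differ between versions.
# Requires the third-party "requests" package (pip install requests).
import requests

API_URL = "http://localhost:5000/api/v1/generate"

payload = {
    "prompt": "The airship drifted over the ruined city, and",
    "max_length": 80,      # tokens to generate (assumed field name)
    "temperature": 0.7,    # sampling temperature (assumed field name)
}

response = requests.post(API_URL, json=payload, timeout=120)
response.raise_for_status()

# Generated continuations are expected under "results" in the JSON reply.
for result in response.json().get("results", []):
    print(result["text"])
```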

Maintenance & Community

Key contributors include The Gantian (creator), VE FORBRYDERNE, and Henk717. Community support is available via Discord.

Licensing & Compatibility

Licensed under AGPL. Publicly available instances must provide source code access. umamba.exe is bundled under BSD-3-Clause. Commercial use is permitted, but modifications must be shared under AGPL.

Limitations & Caveats

The project README strongly directs users to KoboldCpp for GGUF support and to KoboldAI United for newer models and backends (Llama, ExLlama). Local installation can be complex due to Python dependency management, with potential conflicts across Python versions. AMD GPU support is limited to specific Linux distributions with ROCm.

Health Check

  • Last Commit: 8 months ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 1
  • Issues (30d): 0
  • Star History: 19 stars in the last 30 days
