lollms_legacy by ParisNeo

LLM text generation server with multiple personalities

Created 2 years ago
294 stars

Top 89.9% on SourcePulse

Project Summary

LoLLMs Server provides a Flask-based API and WebSocket interface for generating text using various large language models. It's designed for developers and researchers to easily integrate LLM capabilities into applications, offering features like multiple personalities, real-time generation, and local data handling.

How It Works

The server acts as a unified interface to different LLM bindings, allowing users to select and load models from Hugging Face or local storage. It supports multiple "personalities" to influence generation style and uses WebSockets for real-time communication, enabling interactive chat applications. The safe_generate method handles context window management, preventing overflow.
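The overflow guard can be sketched in a few lines. The function below is illustrative only (the real safe_generate API and signature differ); it assumes the prompt is already tokenized and keeps the most recent tokens so that prompt plus generation budget fits the model's context window:

```python
def safe_truncate(prompt_tokens, generation_budget, context_size):
    """Keep the most recent prompt tokens so prompt + generation fits the window.

    Illustrative sketch, not the actual safe_generate implementation.
    """
    available = context_size - generation_budget
    if available <= 0:
        raise ValueError("generation budget exceeds the context window")
    # Drop the oldest tokens first, keeping the tail of the conversation.
    return prompt_tokens[-available:]
```

Dropping the oldest tokens first preserves the most recent conversational context, which is usually what matters for chat-style generation.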

Quick Start & Requirements

  • Install via pip: pip install --upgrade lollms or pip install --upgrade git+https://github.com/ParisNeo/lollms_legacy.git
  • GPU support requires CUDA, installable via Conda: conda create --name lollms python=3.10, conda activate lollms, conda install -c anaconda cudatoolkit.
  • Configuration is done via lollms-settings.
  • Models are loaded from Hugging Face URLs or local paths.
  • Official documentation and examples are available in the repository.

Highlighted Details

  • Supports multiple LLM bindings (e.g., llama_cpp_official).
  • Enables custom personalities for varied text generation styles.
  • Offers real-time text generation via WebSockets.
  • Allows sending files to personalities for context.
  • Can run on multiple nodes for distributed generation.
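For non-streaming use, the Flask side of the server can be reached with the standard library alone. The sketch below builds a generation request; the endpoint path, default port, and JSON field names are assumptions, not the documented API, so check the repository's examples for the exact protocol:

```python
import json
import urllib.request

def build_request(prompt, host="http://localhost:9600", n_predict=128):
    """Construct a POST request for a text-generation endpoint.

    The /generate path, port 9600, and payload fields are assumed for
    illustration; consult the repo docs for the real ones.
    """
    payload = json.dumps({"prompt": prompt, "n_predict": n_predict}).encode()
    return urllib.request.Request(
        f"{host}/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

if __name__ == "__main__":
    # Requires a running server; prints the raw JSON response.
    with urllib.request.urlopen(build_request("Hello")) as resp:
        print(resp.read().decode())
```

For interactive, token-by-token output, the WebSocket interface (via a socket.io client) is the intended path; the repository's examples show the event names to subscribe to.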

Maintenance & Community

The project is hosted on GitHub at https://github.com/ParisNeo/lollms_legacy. Contribution guidelines are available in CONTRIBUTING.md.

Licensing & Compatibility

Licensed under the Apache 2.0 License. This license is permissive and generally compatible with commercial and closed-source applications.

Limitations & Caveats

The lollms_legacy name suggests this is an older, superseded version of the lollms project. The README does not explicitly state its maintenance status or a deprecation plan, so evaluate activity before depending on it.

Health Check

  • Last Commit: 1 month ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 1
  • Issues (30d): 0
  • Star History: 0 stars in the last 30 days
