ollama-telegram by rikkichy

Telegram bot for LLM interaction

Created 1 year ago
390 stars

Top 73.7% on SourcePulse

Project Summary

This project provides a Telegram bot that allows users to interact with Large Language Models (LLMs) hosted via Ollama. It's designed for Telegram users who want a convenient way to chat with AI models directly from their messaging app, offering features like response streaming and group chat integration.

How It Works

The bot leverages the Aiogram 3.x library for Telegram integration and connects to an Ollama instance for LLM inference. It supports response streaming using a "SentenceBySentence" method, which aims to mitigate rate limits by sending responses incrementally. The bot can be configured to run with or without Docker, and includes options for GPU acceleration via Docker.
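The "SentenceBySentence" idea can be sketched as a small buffering generator. This is a minimal illustration, not the project's actual implementation: it accumulates streamed text chunks and emits a message-edit-worthy unit only when a full sentence has arrived, so the bot edits the Telegram message far less often than once per token.

```python
import re

def sentences_from_stream(chunks):
    """Yield complete sentences from an incremental stream of text chunks.

    Sketch of a sentence-by-sentence strategy: buffer chunks and emit
    only at sentence boundaries, reducing how often the Telegram message
    is edited and thus how quickly rate limits are hit.
    """
    buffer = ""
    for chunk in chunks:
        buffer += chunk
        # Split on sentence-ending punctuation followed by whitespace.
        parts = re.split(r"(?<=[.!?])\s+", buffer)
        # Everything except the last fragment is a complete sentence.
        for sentence in parts[:-1]:
            yield sentence
        buffer = parts[-1]
    # Flush whatever remains when the stream ends.
    if buffer.strip():
        yield buffer.strip()
```

In the bot, each yielded sentence would be appended to the in-progress Telegram message via an edit call, instead of editing on every raw chunk.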

Quick Start & Requirements

  • Installation (Docker):
    • Pull the official image: docker pull ruecat/ollama-telegram
    • Download .env.example, rename to .env, and populate variables.
    • Optionally, uncomment the GPU section in the provided docker-compose.yml.
    • Start with docker compose up -d.
  • Installation (Non-Docker):
    • Clone the repository: git clone https://github.com/ruecat/ollama-telegram
    • Install dependencies: pip install -r requirements.txt
    • Configure .env file.
    • Launch: python3 run.py
  • Prerequisites: a Telegram bot token, admin and user IDs, and an Ollama instance (local or Dockerized). GPU support requires NVIDIA drivers and CUDA.
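For orientation, a populated `.env` might look like the fragment below. The variable names here are illustrative assumptions; the authoritative list is the `.env.example` file shipped with the repository.

```env
# Illustrative only — confirm variable names against .env.example in the repo
TOKEN=123456:ABC-your-telegram-bot-token   # from @BotFather
ADMIN_IDS=123456789                        # comma-separated Telegram user IDs
USER_IDS=123456789,987654321               # users allowed to chat with the bot
```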

Highlighted Details

  • Fully Dockerized option available.
  • Response streaming implemented to avoid rate limits.
  • Supports group chat interaction and bot mentions.
  • Includes basic history and a /reset command.

Maintenance & Community

The project acknowledges contributions from StanleyOneG and ShrirajHegde. The roadmap indicates plans for more API functions and Redis integration.

Licensing & Compatibility

The project uses the MIT license, allowing for commercial use and integration with closed-source applications.

Limitations & Caveats

The Docker image currently does not support Apple GPU acceleration. The roadmap indicates that advanced API functions and Redis integration are still under development.

Health Check

  • Last Commit: 2 weeks ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 0
  • Issues (30d): 0
  • Star History: 7 stars in the last 30 days
