Telegram bot for LLM interaction
Top 73.7% on SourcePulse
This project provides a Telegram bot that allows users to interact with Large Language Models (LLMs) hosted via Ollama. It's designed for Telegram users who want a convenient way to chat with AI models directly from their messaging app, offering features like response streaming and group chat integration.
How It Works
The bot leverages the Aiogram 3.x library for Telegram integration and connects to an Ollama instance for LLM inference. It supports response streaming using a "SentenceBySentence" method, which aims to mitigate rate limits by sending responses incrementally. The bot can be configured to run with or without Docker, and includes options for GPU acceleration via Docker.
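The sentence-by-sentence streaming idea can be sketched as a small buffer that accumulates streamed token chunks and emits only complete sentences, so the bot updates the Telegram message once per sentence instead of once per token. This is an illustrative sketch under that assumption, not the project's actual implementation:

```python
import re

def sentences_from_chunks(chunks):
    """Accumulate streamed text chunks and yield complete sentences.

    Editing the Telegram message on every token can trip API rate
    limits; emitting once per finished sentence keeps update volume low.
    """
    buffer = ""
    # A sentence ends at ., !, or ? followed by whitespace.
    boundary = re.compile(r"(?<=[.!?])\s+")
    for chunk in chunks:
        buffer += chunk
        parts = boundary.split(buffer)
        # Every part except the last is a complete sentence.
        for sentence in parts[:-1]:
            yield sentence
        buffer = parts[-1]
    # Flush whatever remains when the stream ends.
    if buffer.strip():
        yield buffer.strip()
```

Feeding it the chunks "Hello", " world. How", " are you?" yields "Hello world." first, then "How are you?" once the stream closes.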
Quick Start & Requirements
With Docker:
- Pull the image: docker pull ruecat/ollama-telegram
- Download .env.example, rename it to .env, and populate the variables.
- Download docker-compose.yml and run: docker compose up -d

Without Docker:
- Clone the repo: git clone https://github.com/ruecat/ollama-telegram
- Install dependencies: pip install -r requirements.txt
- Create the .env file, then start the bot: python3 run.py
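Before starting the bot, the .env file needs at least a Telegram bot token and the Ollama connection details. The variable names below are illustrative assumptions; the authoritative list is in the repository's .env.example:

```
TOKEN=123456:your-telegram-bot-token    # from @BotFather (assumed name)
ADMIN_IDS=123456789                     # comma-separated Telegram user IDs (assumed)
USER_IDS=123456789                      # users allowed to chat with the bot (assumed)
INITMODEL=llama3                        # default Ollama model to load (assumed)
OLLAMA_BASE_URL=localhost               # host where Ollama is running (assumed)
```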
Highlighted Details
Conversation history can be cleared with the /reset command.
Maintenance & Community
The project acknowledges contributions from StanleyOneG and ShrirajHegde. The roadmap indicates plans for more API functions and Redis integration.
Licensing & Compatibility
The project uses the MIT license, allowing for commercial use and integration with closed-source applications.
Limitations & Caveats
The Docker image currently does not support Apple GPU acceleration. The roadmap indicates that advanced API functions and Redis integration are still under development.
Last updated: 2 weeks ago · Inactive