Telegram chatbot using transformers for multi-turn dialogue
This project provides a Telegram chatbot powered by large language models, specifically DialoGPT, for multi-turn conversations. It targets users who want to deploy an AI-powered conversational agent; because the underlying models are trained on Reddit data, conversations tend to have a casual, community-forum flavor.
How It Works
The chatbot leverages Hugging Face's transformers library to run pre-trained dialogue generation models such as DialoGPT. DialoGPT, trained on extensive Reddit dialogue, achieves human-comparable response quality in single-turn Turing tests. For enhanced dialogue quality, DialogRPT models, trained on human feedback, are used to rank candidate responses. Other text generators supported by transformers can also be integrated.
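As a rough illustration of what running DialoGPT through transformers looks like, here is a minimal single-turn sketch. The model name, sampling settings, and helper function are illustrative assumptions, not code or configuration from this project:

```python
# Minimal single-turn DialoGPT sketch using Hugging Face transformers.
# Model name and generation settings are placeholders, not this project's defaults.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-small")
model = AutoModelForCausalLM.from_pretrained("microsoft/DialoGPT-small")

def reply(user_text: str) -> str:
    # DialoGPT expects each turn to be terminated by the EOS token.
    input_ids = tokenizer.encode(user_text + tokenizer.eos_token, return_tensors="pt")
    output_ids = model.generate(
        input_ids,
        max_length=100,
        do_sample=True,
        top_k=50,
        top_p=0.95,
        pad_token_id=tokenizer.eos_token_id,
    )
    # Drop the prompt tokens; keep only the newly generated reply.
    return tokenizer.decode(output_ids[0, input_ids.shape[-1]:], skip_special_tokens=True)

print(reply("Hi, how are you?"))
```

For true multi-turn dialogue, past turns would be concatenated (each terminated by EOS) and fed back in as context; the project handles this, and response ranking, internally.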
Quick Start & Requirements
- Clone the repository, then install dependencies with pip install -r requirements.txt
- Configuration files (.cfg) specify the model size (e.g., medium-cpu, large-gpu) and ranking strategies.
- Test the bot locally in the console with python run_bot.py --type=console
Maintenance & Community
No specific contributor or community links (Discord, Slack, etc.) are mentioned in the README.
Licensing & Compatibility
The README does not explicitly state the license. Compatibility for commercial use or closed-source linking is not specified.
Limitations & Caveats
The README suggests manual tuning of parameters like temperature, top_k, and top_p is necessary to achieve desired conversational styles, implying a trial-and-error process for optimal performance. The model's Reddit training data can lead to unpredictable or "off-topic" responses, especially with higher temperature settings.
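To make the tuning knobs concrete, here is a self-contained sketch of how temperature, top_k, and top_p interact during sampling. This is a generic illustration of the decoding technique, not code from this repository:

```python
import math
import random

def sample_token(logits, temperature=1.0, top_k=0, top_p=1.0):
    """Sample a token index from raw logits with temperature, top-k, and top-p filtering."""
    # Temperature scales logits before softmax: <1 sharpens the distribution, >1 flattens it.
    scaled = [l / temperature for l in logits]
    # Numerically stable softmax.
    m = max(scaled)
    exps = [math.exp(l - m) for l in scaled]
    total = sum(exps)
    probs = sorted(
        ((i, e / total) for i, e in enumerate(exps)),
        key=lambda pair: pair[1],
        reverse=True,
    )
    # top_k: keep only the k most probable tokens (0 disables the filter).
    if top_k > 0:
        probs = probs[:top_k]
    # top_p (nucleus): keep the smallest prefix whose cumulative mass reaches top_p.
    kept, mass = [], 0.0
    for tok, p in probs:
        kept.append((tok, p))
        mass += p
        if mass >= top_p:
            break
    # Renormalize over the surviving tokens and sample.
    remaining = sum(p for _, p in kept)
    r = random.random() * remaining
    for tok, p in kept:
        r -= p
        if r <= 0:
            return tok
    return kept[-1][0]
```

Lower temperature plus small top_k/top_p yields safer, more repetitive replies; raising them increases variety at the cost of the off-topic drift noted above.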