Game mod for interactive, LLM-powered NPC conversations
Top 51.9% on sourcepulse
This project enables real-time, voice-activated conversations with NPCs in any game, enhancing immersion by allowing players to speak directly to characters. It targets players of open-world games like Cyberpunk 2077 and Assassin's Creed, offering a novel way to interact with existing game worlds without modifying game code.
How It Works
The system captures microphone input, transcribes speech, and uses facial recognition to identify the target NPC. This information, along with character-specific data from vector stores and pre-conversation files, is fed to an LLM for response generation. The LLM's output is converted to speech and used to create lip-synced facial animations via SadTalker, which then replaces the NPC's in-game face pixels. Webcam input can also be used for emotion recognition to tailor NPC responses.
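The stages above can be sketched as a simple pipeline. Everything below is a hypothetical stand-in: the real mod wires these stages to a speech-to-text model, a facial-recognition model, an LLM backend, a TTS engine, and SadTalker, and none of these function names come from the project itself.

```python
# Hypothetical sketch of the conversation pipeline; every function is a stub
# standing in for a real component (STT, face recognition, vector-store
# retrieval, LLM, TTS + SadTalker lip-sync).

def transcribe(audio: bytes) -> str:
    """Stub: speech-to-text over captured microphone audio."""
    return "Who are you?"

def identify_npc(frame: bytes) -> str:
    """Stub: facial recognition on the game frame to pick the target NPC."""
    return "Judy"

def retrieve_context(npc: str) -> str:
    """Stub: character data from a vector store / pre-conversation file."""
    return f"{npc} is a braindance technician in Night City."

def generate_reply(npc: str, context: str, player_line: str) -> str:
    """Stub: LLM call combining character context with the player's speech."""
    return f"[{npc}] I could ask you the same thing."

def speak_and_animate(npc: str, reply: str) -> str:
    """Stub: TTS plus lip-synced animation overlaid on the NPC's face pixels."""
    return f"rendered:{reply}"

def converse(audio: bytes, frame: bytes) -> str:
    player_line = transcribe(audio)
    npc = identify_npc(frame)
    context = retrieve_context(npc)
    reply = generate_reply(npc, context, player_line)
    return speak_and_animate(npc, reply)

print(converse(b"mic-audio", b"game-frame"))
```

The point of the sketch is the data flow: each stage's output is the next stage's input, so any component (say, the LLM backend) can be swapped without touching the rest of the chain.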
Quick Start & Requirements
Install dependencies with `pip install -r requirements.txt`, run `scripts/download_models.sh` to fetch the model weights, then launch `webui.bat` from the SadTalker directory.
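Assuming the repository layout matches the steps above (the `sadtalker` directory name and script paths are taken from this summary, not verified against the repo), the full setup sequence would look like:

```shell
# Assumed setup sequence; verify paths against the repository before running.
pip install -r requirements.txt    # install Python dependencies
bash scripts/download_models.sh    # download model weights
cd sadtalker
webui.bat                          # launch the UI (Windows)
```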
Limitations & Caveats
The project relies on specific versions of Python and external tools, and setup can be complex, especially for non-Windows users or those unfamiliar with LLM integration and multimedia processing. Performance may vary based on hardware, particularly for real-time facial animation rendering.
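Given that setup is sensitive to the Python version and external tooling, a small pre-flight check can surface problems before launch. This is a stdlib-only sketch; the supported version range and the required tools (e.g. `ffmpeg`, commonly needed for multimedia processing) are assumptions, not taken from the project's documentation.

```python
# Hypothetical pre-flight check; version bounds and tool list are assumptions.
import shutil
import sys

def preflight(min_version=(3, 8), max_version=(3, 11), tools=("ffmpeg",)):
    """Return a list of human-readable problems; an empty list means good to go."""
    problems = []
    if not (min_version <= sys.version_info[:2] <= max_version):
        problems.append(
            f"Python {sys.version_info.major}.{sys.version_info.minor} "
            f"is outside the assumed supported range {min_version}-{max_version}"
        )
    for tool in tools:
        if shutil.which(tool) is None:
            problems.append(f"required tool not found on PATH: {tool}")
    return problems

for issue in preflight():
    print("WARNING:", issue)
```

Running this before the quick-start steps turns a cryptic mid-setup failure into an actionable warning.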