Chat module for interacting with OpenAI's ChatGPT via CLI or web interface
This project provides a chat module built on the tinystruct framework, offering instant messaging with file sharing and integration with OpenAI's GPT models. It targets developers and users who want a flexible chat solution, accessible from the CLI or a web interface, for direct interaction with advanced language models.
How It Works
The system is built on the tinystruct framework, a Java-based micro-framework, and integrates with OpenAI's GPT-4, GPT-3.5-turbo, and ChatGPT models, so users can interact with these models through a command-line interface or a web UI. For real-time updates in the browser, the architecture uses Comet (HTTP-based server push) rather than WebSockets, which improves compatibility across browsers.
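The project's own request code is not reproduced here, but as a rough sketch of the integration layer, a single chat turn amounts to a call like the following against OpenAI's public Chat Completions API (endpoint and payload follow OpenAI's documentation, not this repository's source):

```bash
# A single chat turn, approximated with OpenAI's public Chat Completions API.
# Endpoint, headers, and payload come from OpenAI's docs, not from this project's code.
curl https://api.openai.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -d '{
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": "Hello!"}]
      }'
```

The assistant's reply comes back in the JSON response under choices[0].message.content.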
Quick Start & Requirements
Configure `application.properties` with `openai.api_key`, or set the `OPENAI_API_KEY` environment variable. Run `./mvnw compile` to compile. Chat from the CLI with `bin/dispatcher chat`, or start a web server (Tomcat/Netty) with `sudo bin/dispatcher start --import org.tinystruct.system.TomcatServer --server-port 777`. A Docker image is available: `m0ver/smalltalk`. The web UI is then reachable at http://localhost:777/?q=talk.
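For convenience, the steps above can be strung together as shown below; the exported key value and the Docker run flags are illustrative assumptions rather than commands taken from the README:

```bash
# Provide the API key (alternatively, set openai.api_key in application.properties).
export OPENAI_API_KEY=sk-...   # placeholder value, not a real key

# Compile the project.
./mvnw compile

# Option A: chat from the command line.
bin/dispatcher chat

# Option B: serve the web UI, then open http://localhost:777/?q=talk in a browser.
sudo bin/dispatcher start --import org.tinystruct.system.TomcatServer --server-port 777

# Option C: run the published Docker image (port mapping is an assumption).
docker run -d -p 777:777 m0ver/smalltalk
```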
Highlighted Details
Maintenance & Community
The project welcomes contributions. Further details on the development process and coding standards are available in CONTRIBUTING.md.
Licensing & Compatibility
Licensed under the Apache License, Version 2.0. This license permits commercial use and linking with closed-source projects.
Limitations & Caveats
The project requires an OpenAI API key, and use of the OpenAI API incurs costs. The README mentions troubleshooting steps but does not detail specific known issues or limitations of the tinystruct framework itself.