ChatGPTCLIBot by LagPixelLOL

CLI tool for GPT models with long-term memory (no longer maintained)

created 2 years ago
343 stars

Top 81.8% on sourcepulse

Project Summary

This project provides a C++ command-line interface (CLI) for interacting with OpenAI's GPT models, offering long-term memory via embeddings and custom document Q&A. It targets users who prefer a terminal-based experience over web GUIs and seek enhanced memory capabilities beyond the standard context window.

How It Works

The bot leverages embeddings to store and retrieve conversation history and custom documents, effectively creating a near-infinite memory capacity limited only by disk space. It streams responses token-by-token, mimicking the ChatGPT web experience, and supports features like chat history saving/loading, custom initial prompts, and undo/reset commands.

Quick Start & Requirements

  • Install/Run: On Windows, download and run GPT3Bot.exe or run.bat; on Linux/macOS, run ./GPT3Bot.
  • Prerequisites: C++17 compiler, Boost, cURL, nlohmann/json, libproxy, cpp-terminal, ftxui, oneTBB, clip, cpp-tiktoken, pcre2, utf8proc. Windows system proxy support requires a specific proxy library.
  • OS: Windows 10/11 64-bit, Linux 64-bit (Ubuntu 20.04+, CentOS 8+), macOS 12+.
  • Links: Stable Release, Development Build

Highlighted Details

  • Long-term memory via embeddings for near-infinite context.
  • Q&A support for custom documents.
  • Token-by-token streaming output.
  • Multiline input support (Ctrl+N or Alt+Enter).
  • UTF-8 and terminal color support.

Maintenance & Community

The project is explicitly marked as "NO LONGER MAINTAINED." The author cites advancements in LLM context windows and integrated RAG as reasons for discontinuation, recommending alternative GUIs like SillyTavern and LobeChat.

Licensing & Compatibility

The README does not specify a license. Compatibility for commercial use or closed-source linking is not addressed.

Limitations & Caveats

The project is no longer maintained, so no future updates or bug fixes are expected. The author suggests that the larger native context windows of newer LLMs are now preferable to embedding-based memory solutions. The system proxy feature is Windows-only due to compilation issues on other operating systems.

Health Check

  • Last commit: 1 year ago
  • Responsiveness: 1 day
  • Pull Requests (30d): 0
  • Issues (30d): 0
  • Star History: 2 stars in the last 90 days

