oterm is a terminal-based client for Ollama, designed for users who want a direct interface to local large language models without running a separate server or web frontend. It provides a persistent, customizable chat experience within the terminal, supporting features such as multiple sessions, system prompt and parameter customization, and integration with external tools via the Model Context Protocol (MCP).
How It Works
oterm connects directly to the Ollama API, so no separate server or frontend process is needed. It stores chat sessions, system prompts, and parameter customizations in an SQLite database for persistence. The client supports the Model Context Protocol (MCP) for tool integration, allowing users to create custom commands and provide external information to LLMs directly from the terminal. It also supports sixel graphics for in-terminal image display.
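The SQLite-backed persistence described above can be sketched as follows. This is an illustrative example only, not oterm's actual schema: the `chat` and `message` tables and their columns are assumptions chosen to show the general approach of storing sessions and replaying them later.

```python
import sqlite3

# Hypothetical schema (not oterm's real one): one row per chat session,
# one row per message, linked by chat_id.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE chat (id INTEGER PRIMARY KEY, name TEXT, system_prompt TEXT);
CREATE TABLE message (
    id INTEGER PRIMARY KEY,
    chat_id INTEGER REFERENCES chat(id),
    role TEXT,        -- 'user' or 'assistant'
    content TEXT
);
""")

# Create a session with a custom system prompt.
chat_id = conn.execute(
    "INSERT INTO chat (name, system_prompt) VALUES (?, ?)",
    ("demo", "You are a helpful assistant."),
).lastrowid

# Persist a message as it is sent.
conn.execute(
    "INSERT INTO message (chat_id, role, content) VALUES (?, ?, ?)",
    (chat_id, "user", "Hello!"),
)

# Restoring the session on the next launch is a simple SELECT.
rows = conn.execute(
    "SELECT role, content FROM message WHERE chat_id = ?", (chat_id,)
).fetchall()
print(rows)  # [('user', 'Hello!')]
```

Because the database is an ordinary SQLite file, sessions survive terminal restarts and can be inspected with standard tooling.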
Quick Start & Requirements
Requires a running Ollama instance. The quickest way to launch oterm is via uvx:
uvx oterm
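The one-liner above can also be pointed at a non-default Ollama endpoint. A sketch, assuming uv is installed and that oterm honors the standard OLLAMA_HOST client variable (the host/port shown is Ollama's default):

```shell
# Launch oterm against a specific Ollama endpoint (assumption: oterm
# reads OLLAMA_HOST, the standard Ollama client environment variable).
OLLAMA_HOST=127.0.0.1:11434 uvx oterm
```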
(see Installation).

Highlighted Details
Maintenance & Community
The project is actively maintained by ggozad. Further community engagement details are not specified in the README.
Licensing & Compatibility
Licensed under the MIT License. This license is permissive and generally compatible with commercial and closed-source applications.
Limitations & Caveats
oterm requires Ollama to be installed and reachable. While it runs on multiple platforms, terminal emulator compatibility varies, especially for advanced features such as sixel graphics.