just-prompt by disler

MCP server for unified LLM provider access

created 4 months ago
566 stars

Top 57.7% on sourcepulse

Project Summary

This project provides a unified Model Context Protocol (MCP) server for interacting with multiple Large Language Model (LLM) providers, including OpenAI, Anthropic, Google Gemini, Groq, DeepSeek, and Ollama. It offers a consistent interface for sending prompts as text or from files, running models in parallel, and includes specialized tools such as a "CEO and Board" decision-making mechanism. This benefits developers and researchers who want to compare and combine diverse LLM capabilities without writing against each provider's API individually.

How It Works

The server implements the Model Context Protocol, abstracting away the differences between LLM provider APIs. It uses a modular design with provider-specific implementations in src/just_prompt/atoms/llm_providers/. Users interact with tools such as prompt, prompt_from_file, and ceo_and_board, specifying models with provider prefixes (e.g., openai:gpt-4o). The architecture supports parallel execution and lets callers control reasoning effort (OpenAI) and thinking tokens/budget (Anthropic, Gemini) via model name suffixes.
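As an illustration, MCP clients are typically pointed at a server through a JSON configuration file. A sketch for registering just-prompt might look like the following; the exact command, arguments, and path are assumptions and depend on your checkout and the README's own instructions:

```json
{
  "mcpServers": {
    "just-prompt": {
      "command": "uv",
      "args": ["--directory", "/path/to/just-prompt", "run", "just-prompt"]
    }
  }
}
```

Once registered, the client exposes the server's tools (prompt, prompt_from_file, ceo_and_board) to the connected LLM session.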

Quick Start & Requirements

  • Install via uv sync after cloning the repository.
  • Requires API keys for desired LLM providers, configured via a .env file or environment variables.
  • Supports OpenAI's reasoning effort control and Anthropic/Gemini's thinking tokens/budget via model name suffixes.
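Concretely, the setup steps above might look like this in a POSIX shell. The environment variable names shown follow each vendor's usual convention and are assumptions; set only the ones for providers you intend to use:

```sh
git clone https://github.com/disler/just-prompt.git
cd just-prompt
uv sync

# .env (or export these as environment variables)
cat > .env <<'EOF'
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
GEMINI_API_KEY=...
GROQ_API_KEY=...
DEEPSEEK_API_KEY=...
OLLAMA_HOST=http://localhost:11434
EOF
```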

Highlighted Details

  • Unified interface for OpenAI, Anthropic, Gemini, Groq, DeepSeek, and Ollama.
  • ceo_and_board tool for multi-model decision-making.
  • Supports controlling reasoning effort (:low, :medium, :high) for OpenAI models.
  • Supports thinking tokens/budget for specific Anthropic (claude-3-7-sonnet-20250219) and Gemini (gemini-2.5-flash-preview-04-17) models.
  • Automatic model name correction and provider availability checks.
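The model strings above follow a "provider:model" shape with an optional trailing suffix for reasoning effort or thinking budget. A minimal sketch of how such a string could be split into its parts (this is illustrative only, not the project's actual parsing code, and `parse_model_string` is a hypothetical helper name):

```python
# Illustrative sketch: splitting a "provider:model[:suffix]" string
# of the kind just-prompt accepts into its components.
def parse_model_string(spec: str) -> dict:
    """Split e.g. 'openai:gpt-4o' or 'openai:o3:high' into parts."""
    parts = spec.split(":")
    if len(parts) < 2:
        raise ValueError(f"expected 'provider:model', got {spec!r}")
    provider, model = parts[0], parts[1]
    # A third segment, if present, carries the effort/thinking suffix.
    suffix = parts[2] if len(parts) > 2 else None
    return {"provider": provider, "model": model, "suffix": suffix}

print(parse_model_string("openai:o3:high"))
# {'provider': 'openai', 'model': 'o3', 'suffix': 'high'}
```

The same convention would cover an Anthropic thinking budget (e.g. a suffix on claude-3-7-sonnet-20250219) or a Gemini thinking model.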

Maintenance & Community

The project is hosted on GitHub at disler/just-prompt. Further community or maintenance details are not explicitly provided in the README.

Licensing & Compatibility

The repository's pyproject.toml shows that it uses uv for dependency management. No license is stated in the README, which may complicate commercial use or closed-source integration.

Limitations & Caveats

The README does not specify the project's license, which is a critical factor for adoption. It also lacks explicit details on test coverage and community support channels.

Health Check

  • Last commit: 1 month ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 2
  • Issues (30d): 1
  • Star History: 396 stars in the last 90 days

Explore Similar Projects

Starred by Tobi Lutke (cofounder of Shopify), Chip Huyen (author of AI Engineering and Designing Machine Learning Systems), and 15 more.

litellm by BerriAI
SDK/proxy for calling 100+ LLM APIs using the OpenAI format. 27k stars; created 2 years ago; updated 21 hours ago.