AI-driven evolution for system text components
GEPA is a framework for optimizing text-based components within any system, such as AI prompts, code, or specifications, using an evolutionary approach driven by LLM reflection. It targets developers and researchers who want to improve system performance by iteratively refining these text components against defined evaluation metrics, producing robust, high-performing variants within a modest evaluation budget.
How It Works
GEPA employs a "Reflective Text Evolution" strategy. It uses Large Language Models (LLMs) to analyze feedback from system execution and evaluation traces, reflecting on performance to generate targeted mutations for text components. Candidates are iteratively mutated, evaluated, and selected using a Pareto-aware approach, allowing for the co-evolution of multiple components within modular systems to achieve domain-specific improvements.
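As a rough illustration of this loop, the sketch below shows a Pareto-aware mutate-evaluate-select cycle in plain Python. The function names (evaluate, reflect_and_mutate, pareto_front) and the loop structure are illustrative stand-ins, not GEPA's actual internals or API.

```python
import random

def evaluate(candidate: dict[str, str], tasks: list[dict]) -> list[float]:
    """Run the system with the candidate's texts on each task; return per-task scores."""
    return [random.random() for _ in tasks]  # stand-in for real execution plus a metric

def reflect_and_mutate(candidate: dict[str, str], feedback: list[str]) -> dict[str, str]:
    """Ask an LLM to rewrite one text component based on execution feedback."""
    return dict(candidate)  # stand-in for an LLM-proposed targeted edit

def pareto_front(pool):
    """Keep candidates that are not dominated on every per-task score."""
    front = []
    for cand, scores in pool:
        dominated = any(
            all(o >= s for o, s in zip(other, scores))
            and any(o > s for o, s in zip(other, scores))
            for _, other in pool
        )
        if not dominated:
            front.append((cand, scores))
    return front

def gepa_loop(seed: dict[str, str], tasks: list[dict], budget: int) -> dict[str, str]:
    pool = [(seed, evaluate(seed, tasks))]
    calls = len(tasks)
    while calls + len(tasks) <= budget:
        parent, _ = random.choice(pareto_front(pool))    # Pareto-aware parent selection
        child = reflect_and_mutate(parent, feedback=[])  # reflection proposes a targeted edit
        pool.append((child, evaluate(child, tasks)))     # score the new candidate
        calls += len(tasks)
    return max(pool, key=lambda item: sum(item[1]))[0]   # best total score wins
```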
Quick Start & Requirements
Install from PyPI with pip install gepa, or from source with pip install git+https://github.com/gepa-ai/gepa.git. Running the examples requires access to LLMs (e.g., openai/gpt-4.1-mini, openai/gpt-5); an OPENAI_API_KEY environment variable is necessary for the provided examples.
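As a minimal sketch of a run, the example below follows the pattern of the repository's quick-start: a seed prompt is optimized against small train/validation sets under a fixed metric-call budget. The argument names (seed_candidate, task_lm, reflection_lm, max_metric_calls) and the toy dataset format are assumptions based on that pattern and should be checked against the current README before use.

```python
import gepa  # pip install gepa; requires OPENAI_API_KEY in the environment

# Toy train/validation records; the expected record format depends on the
# task adapter you use, so treat these as placeholders.
trainset = [{"input": "What is 2 + 2?", "answer": "4"}]
valset = [{"input": "What is 3 + 5?", "answer": "8"}]

# The text component(s) to evolve; here a single system prompt.
seed_candidate = {
    "system_prompt": "Answer the question, ending with '### <final answer>'."
}

result = gepa.optimize(
    seed_candidate=seed_candidate,
    trainset=trainset,
    valset=valset,
    task_lm="openai/gpt-4.1-mini",   # model whose prompt is being optimized
    reflection_lm="openai/gpt-5",    # stronger model used to reflect and propose edits
    max_metric_calls=150,            # evaluation budget (see Limitations & Caveats)
)

print(result.best_candidate["system_prompt"])
```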
Highlighted Details
The dspy.GEPA API provides seamless integration with the DSPy framework, simplifying prompt optimization tasks. The GEPAAdapter interface enables GEPA to plug into diverse systems, including single-turn LLM interactions, multi-turn agents (e.g., terminal-bench), and full program evolution.
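As a rough sketch of what an adapter has to provide, the class below wraps a hypothetical single-turn LLM pipeline: it runs the system with a given set of candidate texts, scores the outputs, and exposes failing traces as feedback for reflection. The class and method names (MyTaskAdapter, run_batch, feedback_for_reflection) are illustrative placeholders, not the actual GEPAAdapter signatures; consult the repository for the real interface.

```python
from dataclasses import dataclass

@dataclass
class RolloutResult:
    score: float   # metric value for one task instance
    trace: str     # execution trace that a reflection LLM can read

class MyTaskAdapter:
    """Hypothetical bridge between GEPA's optimizer and a custom single-turn LLM pipeline."""

    def run_batch(self, candidate: dict[str, str], batch: list[dict]) -> list[RolloutResult]:
        # Execute the system with the candidate's text components on each record
        # and score the outputs against reference answers.
        results = []
        for record in batch:
            output = self._call_llm(candidate["system_prompt"], record["input"])
            score = float(output.strip() == record["answer"])
            results.append(RolloutResult(score=score, trace=f"in={record['input']!r} out={output!r}"))
        return results

    def feedback_for_reflection(self, results: list[RolloutResult]) -> list[str]:
        # Surface failing traces so the reflection LLM can propose targeted edits.
        return [r.trace for r in results if r.score == 0.0]

    def _call_llm(self, system_prompt: str, user_input: str) -> str:
        raise NotImplementedError("wire up your model client here")
```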
Maintenance & Community
The project is associated with the authors of the GEPA paper, including Lakshya A Agrawal and Matei Zaharia. Community engagement is encouraged via GitHub issues for support and feature requests, and discussions can be held on Discord. Updates and announcements are shared on X (formerly Twitter) via @LakshyAAAgrawal and @lateinteraction.
Licensing & Compatibility
The provided README does not explicitly state the software license. Users should verify licensing terms before adoption, especially concerning commercial use or integration into closed-source projects.
Limitations & Caveats
Practical application requires access to and configuration of specific LLMs, often necessitating API keys. The optimization process itself can be computationally intensive and may require careful tuning of parameters like max_metric_calls
to balance performance gains with evaluation budgets.