OpenELM: evolutionary search with language models in code and natural language
OpenELM is an open-source library by CarperAI that enables evolutionary search with large language models (LLMs) for both code and natural language generation. It targets researchers and developers looking to explore novel AI capabilities through evolutionary algorithms, offering flexibility for local or API-based LLM integration and diverse computational profiles.
How It Works
OpenELM integrates LLMs with quality-diversity (QD) algorithms such as MAP-Elites, as well as a simple genetic algorithm. It supports prompt-based mutation, specialized diff models for code, and LLM-driven crossover. LLMs are instantiated via LangChain, giving broad compatibility with local Hugging Face models and API-based services. For high-throughput scenarios, optional NVIDIA Triton Inference Server support is available.
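To make the QD loop concrete, the sketch below is a minimal, self-contained MAP-Elites loop with an LLM-style mutation operator stubbed out. It illustrates the pattern OpenELM builds on but does not use OpenELM's actual classes; the llm_mutate and evaluate functions here are placeholder assumptions that a real run would delegate to the library's models and environments.

```python
import random

def llm_mutate(program: str) -> str:
    # Stand-in for prompt-based mutation: a real run would embed the program
    # in a prompt, send it to a local Hugging Face model or an API-backed LLM,
    # and return the model's rewritten program.
    return program + f"\n# variation {random.randint(0, 999)}"

def evaluate(program: str) -> tuple[float, tuple[int, int]]:
    # Return (fitness, behaviour descriptor). Both are mocked here; OpenELM
    # environments such as Sodarace define them for real genotypes.
    fitness = random.random()
    descriptor = (len(program) % 10, program.count("#") % 10)
    return fitness, descriptor

# MAP-Elites archive: at most one elite kept per cell of the descriptor grid.
archive: dict[tuple[int, int], tuple[float, str]] = {}

def insert(program: str) -> None:
    fitness, cell = evaluate(program)
    if cell not in archive or fitness > archive[cell][0]:
        archive[cell] = (fitness, program)

insert("def walk():\n    pass")      # seed individual
for _ in range(200):                 # main QD loop: pick an elite, mutate, re-insert
    _, parent = random.choice(list(archive.values()))
    insert(llm_mutate(parent))

print(len(archive), "cells filled; best fitness:",
      max(f for f, _ in archive.values()))
```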
Quick Start & Requirements
Install the core package with pip install openelm. For the optional sodaracer environment, first run pip install swig, then pip install openelm[sodaracer]. See pyproject.toml for further install options.
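Once installed, runs are launched through the library's ELM entry point (the repository also ships a run_elm.py driver). The sketch below is a hedged illustration of that pattern, not a definitive recipe: the ELMConfig import path and the run() arguments are assumptions to verify against the repository.

```python
# Hedged first-run sketch: the ELM class and its run() method follow the usage
# pattern shown in the project's README, but the ELMConfig import path and the
# values below are assumptions — in practice, run_elm.py assembles this
# configuration via hydra, so check the repository for the exact setup.
from openelm import ELM
from openelm.configs import ELMConfig  # assumed location of the config dataclass

config = ELMConfig()              # assumed defaults: environment, model, QD settings
elm = ELM(config)                 # wires together the LLM, environment, and QD archive
best = elm.run(init_steps=2, total_steps=10)  # small run as a smoke test
print("Best individual:", best)
```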
Highlighted Details
Maintenance & Community
Licensing & Compatibility
Limitations & Caveats
The project is under active development; some environments and features live in experimental branches, so APIs and behavior may still change.