evosax by RobertTLange

Evolution Strategies in JAX for high-throughput hardware acceleration

created 4 years ago
650 stars

Top 52.3% on sourcepulse

View on GitHub: https://github.com/RobertTLange/evosax
Project Summary

evosax provides a high-performance, JAX-based library for implementing Evolution Strategies (ES). It targets researchers and practitioners in areas like neuroevolution and black-box optimization, enabling efficient scaling of evolutionary algorithms on modern hardware accelerators (CPUs, GPUs, TPUs) without complex distributed setups. The library offers a unified API for over 30 ES algorithms, including classics and recent advancements, fully leveraging JAX's compilation and transformation capabilities.
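
To illustrate what a unified interface across strategies makes possible, here is a minimal sketch; the class names and constructor arguments (popsize, num_dims) follow the classic evosax API as documented and may differ in the latest release.

    from evosax import CMA_ES, OpenES  # two of the 30+ bundled strategies (OpenES is the OpenAI-ES variant)

    # Because every strategy shares the same constructor and ask/tell interface,
    # the same driver code can be reused across algorithms by swapping the class.
    for StrategyCls in (CMA_ES, OpenES):
        strategy = StrategyCls(popsize=32, num_dims=10)
        print(StrategyCls.__name__, strategy.default_params)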

How It Works

evosax utilizes JAX's XLA compilation and transformation primitives (jit, vmap, lax.scan) to achieve high throughput and scalability. The core API follows the "ask-evaluate-tell" cycle common in ES. This design allows for efficient vectorization across populations and parallel execution on accelerators, abstracting away the complexities of distributed computing while enabling seamless integration with JAX's functional programming paradigm.
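
A minimal sketch of the ask-evaluate-tell cycle is shown below. It follows the pattern documented in the evosax README; the class and method names (CMA_ES, initialize, ask, tell) reflect the classic evosax API and may differ in newer releases, and the sphere objective is an illustrative placeholder.

    import jax
    import jax.numpy as jnp
    from evosax import CMA_ES  # one of the 30+ bundled strategies

    def sphere(x):
        # Illustrative objective to minimize; replace with your own fitness function.
        return jnp.sum(x ** 2)

    key = jax.random.PRNGKey(0)
    strategy = CMA_ES(popsize=32, num_dims=10)
    es_params = strategy.default_params
    state = strategy.initialize(key, es_params)

    for _ in range(100):
        key, key_ask = jax.random.split(key)
        # ask: sample a population of candidate solutions
        x, state = strategy.ask(key_ask, state, es_params)
        # evaluate: score all candidates in one vectorized call
        fitness = jax.vmap(sphere)(x)
        # tell: update the search distribution with the observed fitness
        state = strategy.tell(x, fitness, state, es_params)

    print(state.best_fitness)  # best fitness observed so far (minimization by default)

Because ask, the evaluation, and tell are pure JAX functions, the generation step can in principle be jit-compiled or unrolled with lax.scan, which is what enables the throughput described above.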

Quick Start & Requirements

  • Install via pip: pip install evosax
  • Requires Python 3.10+ and a working JAX installation.
  • Install the latest development version from GitHub: pip install git+https://github.com/RobertTLange/evosax.git@main
  • Documentation and example notebooks are available, including Basic evosax API Usage and further Examples in the repository.

Highlighted Details

  • Implements over 30 classic and modern Evolution Strategies (e.g., CMA-ES, OpenAI-ES, Diffusion Evolution).
  • Fully compatible with JAX transformations (jit, vmap, lax.scan) for acceleration and scalability; see the sketch after this list.
  • Optimized for high-throughput execution on CPUs, GPUs, and TPUs.
  • Includes built-in optimizers (ClipUp), fitness shaping, and restart strategies.
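
As referenced in the list above, the following library-agnostic sketch shows how vmap and jit vectorize population evaluation on an accelerator; the sphere objective and the population shape are illustrative assumptions, not part of evosax itself.

    import jax
    import jax.numpy as jnp

    def sphere(x):
        # Illustrative objective; in practice this is your fitness function.
        return jnp.sum(x ** 2)

    # vmap maps the objective over the population axis; jit compiles the result
    # with XLA so it runs efficiently on CPU, GPU, or TPU.
    evaluate_population = jax.jit(jax.vmap(sphere))

    key = jax.random.PRNGKey(0)
    population = jax.random.normal(key, (64, 10))  # 64 candidates, 10 dimensions
    fitness = evaluate_population(population)      # shape (64,)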


Licensing & Compatibility

  • Licensed under the MIT License.
  • Permissive license suitable for commercial use and integration into closed-source projects.

Limitations & Caveats

The library contains independent reimplementations of LES and DES that are not affiliated with or endorsed by Google DeepMind. Although these have been tested to reproduce the officially reported results, users should be aware of potential discrepancies in highly specific research contexts.

Health Check

  • Last commit: 1 month ago
  • Responsiveness: 1+ week
  • Pull requests (30d): 0
  • Issues (30d): 0
  • Star history: 45 stars in the last 90 days

Explore Similar Projects

Starred by Jiayi Pan (Author of SWE-Gym; AI Researcher at UC Berkeley), Thomas Wolf (Cofounder of Hugging Face), and 3 more.

levanter by stanford-crfm
Framework for training foundation models with JAX
Top 0.5% on sourcepulse · 628 stars · created 3 years ago · updated 1 day ago
Starred by Chip Huyen (Author of AI Engineering, Designing Machine Learning Systems), Jiayi Pan (Author of SWE-Gym; AI Researcher at UC Berkeley), and 11 more.

alpa by alpa-projects
Auto-parallelization framework for large-scale neural network training and serving
Top 0.2% on sourcepulse · 3k stars · created 4 years ago · updated 1 year ago