PyTorch library for differentiable optimization
TorchOpt is a PyTorch library for efficient differentiable optimization, targeting researchers and practitioners in meta-learning and related fields. It provides a flexible, functional API for implementing complex optimization algorithms, enabling end-to-end training of models where optimization steps are part of the learning process.
How It Works
TorchOpt adopts a functional programming paradigm inspired by JAX's Optax library, letting users compose optimizers and apply them to model parameters as explicit state. It supports three core differentiation modes: Explicit Gradient (EG), which backpropagates through an unrolled optimization path; Implicit Gradient (IG), which derives analytical gradients from the stationary conditions of the inner problem; and Zero-order Differentiation (ZD), for inner loops that are non-differentiable or too expensive to backpropagate through. All three allow gradients to flow through optimization steps, which is what makes end-to-end meta-learning possible.
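A minimal sketch of the functional loop, based on TorchOpt's Optax-style interface (`torchopt.adam`, `init`/`update`, `torchopt.apply_updates`); the toy least-squares problem and hyperparameters here are placeholders:

```python
import torch
import torchopt

# Parameters live in an explicit pytree; optimizer state is likewise explicit.
params = (torch.randn(4, 1, requires_grad=True),)

optimizer = torchopt.adam(lr=1e-2)   # a composable gradient transformation
opt_state = optimizer.init(params)   # no hidden state inside the optimizer

x, y = torch.randn(8, 4), torch.randn(8, 1)
for _ in range(100):
    loss = torch.nn.functional.mse_loss(x @ params[0], y)
    grads = torch.autograd.grad(loss, params)             # functional gradients
    updates, opt_state = optimizer.update(grads, opt_state)
    params = torchopt.apply_updates(params, updates)      # apply the update step
```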
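For the explicit-gradient (EG) mode, TorchOpt also provides differentiable object-style optimizers such as `MetaSGD` and `MetaAdam`, whose `step` keeps the autograd graph alive across updates. A MAML-flavored sketch, assuming a toy regression task (the data, model size, and step counts are placeholders):

```python
import torch
import torchopt

net = torch.nn.Linear(2, 1)
meta_params = tuple(net.parameters())       # leaf tensors we meta-learn

inner_opt = torchopt.MetaSGD(net, lr=0.1)   # differentiable inner optimizer

x, y = torch.randn(16, 2), torch.randn(16, 1)
for _ in range(3):                          # unrolled inner loop (EG mode)
    inner_loss = torch.nn.functional.mse_loss(net(x), y)
    inner_opt.step(inner_loss)              # updates stay on the autograd graph

# Differentiate the post-adaptation loss w.r.t. the initial parameters.
outer_loss = torch.nn.functional.mse_loss(net(x), y)
meta_grads = torch.autograd.grad(outer_loss, meta_params)
```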
Quick Start & Requirements
Install from PyPI with pip; the extra index URL points at PyTorch's CUDA 12.1 wheel index for GPU builds:

```bash
pip3 install torchopt --extra-index-url https://download.pytorch.org/whl/cu121
```
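A quick sanity check that the install worked (this assumes `torchopt` exposes `__version__`, as is conventional):

```python
import torch
import torchopt

print(torch.__version__, torchopt.__version__)
```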
Highlighted Details
- Optax-inspired functional API for composing optimizers
- Three differentiation modes: explicit (EG), implicit (IG), and zero-order (ZD)
- Installable from PyPI, with CUDA-enabled PyTorch wheels via the extra index URL
Maintenance & Community
The project is developed by Jie Ren, Xidong Feng, Bo Liu, Xuehai Pan, Luo Mai, and Yaodong Yang. A changelog is available.
Licensing & Compatibility
Released under the Apache License, Version 2.0. This license is permissive and generally compatible with commercial use and closed-source linking.
Limitations & Caveats
Even with three differentiation modes available, implementing and debugging meta-learning algorithms remains nontrivial. Implicit-gradient methods in particular require a carefully specified stationary condition for the inner problem; see the sketch below.
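To make that last point concrete: implicit differentiation derives meta-gradients by applying the implicit function theorem to the inner problem's stationary condition, rather than by backpropagating through an unrolled trajectory. In generic notation (not TorchOpt-specific), with inner solution \(\theta^*(\phi)\) and meta-parameters \(\phi\):

```latex
% Stationary (fixed-point) condition of the inner problem:
%   F(\theta^*(\phi), \phi) = 0
% The implicit function theorem then gives the meta-Jacobian directly:
\frac{\mathrm{d}\theta^*}{\mathrm{d}\phi}
  = -\left( \frac{\partial F}{\partial \theta} \right)^{-1}
     \frac{\partial F}{\partial \phi}
```

Getting \(F\) wrong (e.g., using a condition the inner solver does not actually satisfy at convergence) silently corrupts the meta-gradient, which is why the stationary condition deserves care.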