torchopt by metaopt

PyTorch library for differentiable optimization

Created 3 years ago
615 stars

Top 53.5% on SourcePulse

Project Summary

TorchOpt is a PyTorch library for efficient differentiable optimization, targeting researchers and practitioners in meta-learning and related fields. It provides a flexible, functional API for implementing complex optimization algorithms, enabling end-to-end training of models where optimization steps are part of the learning process.

How It Works

TorchOpt offers a functional programming paradigm, inspired by JAX's Optax, allowing users to compose optimizers and apply them to model parameters. It supports three core differentiation modes: Explicit Gradient (EG) for unrolled optimization paths, Implicit Gradient (IG) using analytical derivatives from stationary conditions, and Zero-order Differentiation (ZD) for non-differentiable or computationally intensive inner loops. This approach facilitates meta-learning by allowing gradients to flow through optimization steps.
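The Optax-style functional paradigm described above can be sketched in plain Python: an optimizer is a pair of pure functions, `init` (build optimizer state from parameters) and `update` (turn gradients into parameter updates), plus a separate `apply_updates` step. This is a minimal illustrative sketch of the pattern, not TorchOpt's actual implementation; the SGD example and names below are hypothetical.

```python
# Minimal sketch of the Optax-style functional optimizer pattern:
# an optimizer is a pair of pure functions (init, update) acting on
# parameter "pytrees" (here, plain lists of floats).

def sgd(lr):
    def init(params):
        return None                      # plain SGD keeps no optimizer state
    def update(grads, state):
        updates = [-lr * g for g in grads]
        return updates, state
    return init, update

def apply_updates(params, updates):
    return [p + u for p, u in zip(params, updates)]

init, update = sgd(lr=0.1)
params = [1.0, -2.0]
state = init(params)
grads = [2 * p for p in params]          # gradient of sum(p**2)
updates, state = update(grads, state)
params = apply_updates(params, updates)  # params move toward zero
```

Because `init` and `update` are pure functions of their inputs, composing optimizers and differentiating through optimization steps becomes a matter of ordinary function composition, which is what makes the pattern a natural fit for meta-learning.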

Quick Start & Requirements

Highlighted Details

  • Supports explicit, implicit, and zero-order differentiation for meta-learning.
  • Offers both functional (Optax-like) and PyTorch-style OOP APIs.
  • Includes CPU/GPU accelerated optimizers and RPC-based distributed training.
  • Features a PyTree implementation (OpTree) for efficient nested structure handling.
  • Provides gradient visualization tools to aid debugging complex meta-learning setups.

Maintenance & Community

The project is developed by Jie Ren, Xidong Feng, Bo Liu, Xuehai Pan, Luo Mai, and Yaodong Yang. A changelog is available.

Licensing & Compatibility

Released under the Apache License, Version 2.0. This license is permissive and generally compatible with commercial use and closed-source linking.

Limitations & Caveats

TorchOpt offers multiple differentiation modes, but implementing and debugging meta-learning algorithms remains nontrivial. In particular, implicit-gradient methods require a carefully specified stationary (fixed-point) condition for the inner optimization.

Health Check

  • Last commit: 2 weeks ago
  • Responsiveness: 1 day
  • Pull requests (30d): 0
  • Issues (30d): 0
  • Star history: 4 stars in the last 30 days
