PyTorch library for optimizers, LR schedulers, and loss functions
This repository provides a comprehensive collection of optimizers, learning rate schedulers, and loss functions for PyTorch, aiming to simplify and enhance deep learning model training. It is designed for researchers and practitioners seeking to experiment with a wide array of optimization techniques beyond standard offerings, potentially improving convergence speed and model generalization.
How It Works
The library offers a unified interface to over 100 optimizers, 16 LR schedulers, and 13 loss functions. It integrates popular and novel algorithms, including variants of Adam, SGD, and Sharpness-Aware Minimization (SAM), along with specialized optimizers such as Lion and Prodigy. The core design emphasizes ease of use: optimizers can be loaded by name or through a `create_optimizer` function, and the library supports integration with packages like `bitsandbytes` for 8-bit optimization.
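The name-based loading described above can be sketched with a small registry. The snippet below is an illustration of the pattern only, not the library's actual implementation; the `register` decorator, `build_optimizer` helper, and the stub optimizer classes are all hypothetical.

```python
# Minimal sketch of name-based optimizer lookup, similar in spirit to a
# create_optimizer-style helper. Registry contents are hypothetical stubs.

OPTIMIZER_REGISTRY = {}

def register(name):
    """Decorator that records an optimizer class under a lookup name."""
    def wrap(cls):
        OPTIMIZER_REGISTRY[name.lower()] = cls
        return cls
    return wrap

@register("sgd")
class SGD:
    def __init__(self, params, lr=1e-2):
        self.params, self.lr = params, lr

@register("adamw")
class AdamW:
    def __init__(self, params, lr=1e-3, weight_decay=1e-2):
        self.params, self.lr, self.weight_decay = params, lr, weight_decay

def build_optimizer(name, params, **kwargs):
    """Look up an optimizer class by (case-insensitive) name and instantiate it."""
    try:
        cls = OPTIMIZER_REGISTRY[name.lower()]
    except KeyError:
        raise ValueError(f"unknown optimizer: {name!r}") from None
    return cls(params, **kwargs)

opt = build_optimizer("AdamW", params=[], lr=3e-4)
print(type(opt).__name__)  # -> AdamW
```

A registry like this is what makes string-based configuration (e.g. selecting an optimizer from a YAML config) convenient: callers never import the concrete class directly.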
Quick Start & Requirements
pip3 install pytorch-optimizer

If your Python version falls outside the package's declared requires-python range, pip may need the `--ignore-requires-python` flag. `bitsandbytes`, `q-galore-torch`, and `torchao` are optional dependencies and require separate installation.

Highlighted Details
- Integration with `bitsandbytes` for 8-bit optimizers and memory-efficient training.
- `torch.hub` loading option for easy experimentation.

Maintenance & Community
Licensing & Compatibility
Limitations & Caveats
The library includes optimizers with non-commercial licenses, requiring careful attention from users intending to use them in commercial projects.