DL framework for training at scale, optimized for large-scale clusters
Top 9.5% on sourcepulse
Composer is an open-source PyTorch library designed to simplify and accelerate deep learning model training at scale. It targets researchers and engineers working with large models such as LLMs, diffusion models, and other transformer architectures, abstracting away the complexities of distributed training, data loading, and memory optimization so teams can experiment and iterate faster.
How It Works
Composer centers on a highly optimized Trainer abstraction that streamlines PyTorch training loops. It integrates advanced parallelism techniques, including PyTorch's Fully Sharded Data Parallel (FSDP) and standard DistributedDataParallel (DDP), for efficient multi-node training. A flexible callback system lets users inject custom logic at well-defined points in the training loop, while built-in speedup algorithms, drawn from recent research, can be composed into "recipes" that significantly boost training throughput.
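The event/callback pattern described above can be pictured with a toy sketch. This is a simplification in plain Python, not Composer's actual implementation; the `Callback`, `run_event`, and `batch_end` names merely mirror Composer's concepts:

```python
# Toy model of an event/callback training loop: the loop fires named
# events, and any number of callbacks may hook each one.
class Callback:
    def run_event(self, event, state):
        # Dispatch to a method named after the event, if the callback defines one.
        handler = getattr(self, event, None)
        if handler:
            handler(state)

class LossLogger(Callback):
    """Records the loss after every batch."""
    def __init__(self):
        self.history = []

    def batch_end(self, state):
        self.history.append(state["loss"])

def train(callbacks, num_batches=3):
    state = {"loss": 1.0}
    for _ in range(num_batches):
        state["loss"] *= 0.5           # stand-in for an optimizer step
        for cb in callbacks:
            cb.run_event("batch_end", state)
    return state

logger = LossLogger()
train([logger])
print(logger.history)  # one loss value per batch
```

Because callbacks only see named events and shared state, several of them (loggers, checkpointers, speedup algorithms) can be composed without modifying the loop itself.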
Quick Start & Requirements
pip install mosaicml
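Once installed, a minimal training script follows roughly this shape. This is a sketch based on Composer's documented Trainer API: the model and dataloader here are placeholders, and the snippet assumes the mosaicml package and PyTorch are installed.

```python
import torch
from composer import Trainer
from composer.models import ComposerClassifier

# Wrap any torch.nn.Module so the Trainer can drive it (placeholder model).
module = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(28 * 28, 10))
model = ComposerClassifier(module, num_classes=10)

train_loader = torch.utils.data.DataLoader(...)  # your dataset here

trainer = Trainer(
    model=model,
    train_dataloader=train_loader,
    max_duration="1ep",  # train for one epoch
)
trainer.fit()
```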
Highlighted Details
Maintenance & Community
Licensing & Compatibility
Limitations & Caveats