chronos-forecasting by amazon-science

Pretrained models for probabilistic time series forecasting research

created 1 year ago
3,477 stars

Top 14.2% on sourcepulse

Project Summary

Chronos is a family of pretrained probabilistic time series forecasting models that leverage language model architectures. It transforms time series into token sequences, enabling efficient forecasting for researchers and practitioners. The project offers various model sizes and versions, including the faster and more accurate Chronos-Bolt, with extensive documentation and integration options.

How It Works

Chronos treats time series forecasting as a language modeling task. Input time series are scaled and quantized into a fixed vocabulary of discrete tokens. These token sequences are then processed by language models based on existing transformer architectures, either encoder-decoder (the released T5-based checkpoints) or decoder-only. During inference, the model autoregressively samples future token trajectories, which are de-quantized and re-scaled to produce probabilistic forecasts. This approach enables zero-shot generalization to unseen datasets and efficient inference.
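
Below is a minimal sketch of the scale-and-quantize idea, written to illustrate the approach rather than mirror the library's internals; the mean-absolute scaling, bin count, and value range used here are assumptions, not the project's exact defaults.

    import numpy as np

    def scale_and_quantize(context, n_bins=4094, limit=15.0):
        # Mean-absolute scaling, then uniform binning into discrete token ids.
        # Bin count and the [-limit, limit] range are illustrative only.
        scale = np.abs(context).mean() + 1e-8
        scaled = context / scale
        edges = np.linspace(-limit, limit, n_bins + 1)
        tokens = np.clip(np.digitize(scaled, edges), 1, n_bins)
        return tokens, scale

    def dequantize(tokens, scale, n_bins=4094, limit=15.0):
        # Map token ids back to approximate real values via bin centers.
        edges = np.linspace(-limit, limit, n_bins + 1)
        centers = (edges[:-1] + edges[1:]) / 2
        return centers[tokens - 1] * scale

    series = np.array([112.0, 118.0, 132.0, 129.0, 121.0, 135.0])
    tokens, scale = scale_and_quantize(series)
    reconstructed = dequantize(tokens, scale)  # close to the original values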

Quick Start & Requirements

  • Install via pip: pip install chronos-forecasting
  • For research and development, clone the repository and install from source: pip install --editable ".[training]"
  • Requires Python and PyTorch. GPU with CUDA is recommended for optimal performance.
  • Example usage and tutorials for AutoGluon and SageMaker deployment are available; a minimal inference sketch follows this list.
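
A minimal inference sketch, assuming the ChronosPipeline API and the amazon/chronos-t5-small checkpoint described in the repository README; the context values are toy data and exact signatures may differ between releases.

    import torch
    from chronos import ChronosPipeline

    # Load a pretrained checkpoint; model id and dtype are illustrative choices.
    pipeline = ChronosPipeline.from_pretrained(
        "amazon/chronos-t5-small",
        device_map="cuda",            # use "cpu" if no GPU is available
        torch_dtype=torch.bfloat16,
    )

    # Historical observations for a single series (toy values).
    context = torch.tensor([112.0, 118.0, 132.0, 129.0, 121.0, 135.0, 148.0, 148.0])

    # Sample probabilistic forecasts for the next 12 steps; the result has
    # shape [num_series, num_samples, prediction_length].
    forecast = pipeline.predict(context, prediction_length=12)

    # Summarize the sampled trajectories into quantile forecasts.
    low, median, high = torch.quantile(
        forecast[0].float(), torch.tensor([0.1, 0.5, 0.9]), dim=0
    )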

Highlighted Details

  • Offers Chronos and Chronos-Bolt models, with the Bolt variants up to 250x faster and 20x more memory-efficient than the original Chronos models.
  • Achieves strong zero-shot performance across numerous datasets compared to local models and other pretrained baselines.
  • Supports MLX inference on Apple Silicon Macs for faster forecasting.
  • Provides functionality to extract encoder embeddings from Chronos models (see the sketch after this list).
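
A short sketch of embedding extraction, assuming the embed method documented in the repository README; the checkpoint choice is illustrative, and the second return value is the tokenizer state (the scaling information used during tokenization).

    import torch
    from chronos import ChronosPipeline

    pipeline = ChronosPipeline.from_pretrained(
        "amazon/chronos-t5-small",   # illustrative checkpoint choice
        device_map="cpu",
        torch_dtype=torch.float32,
    )

    context = torch.tensor([112.0, 118.0, 132.0, 129.0, 121.0, 135.0])

    # embed() returns per-token encoder embeddings plus the tokenizer state,
    # per the repository README; the embedding shape is roughly
    # [num_series, num_tokens, d_model].
    embeddings, tokenizer_state = pipeline.embed(context)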

Maintenance & Community

  • Developed by Amazon Science.
  • Active development; recent updates include the Chronos-Bolt release and benchmarking tooling (fev).
  • Integration with AutoGluon-TimeSeries for AutoML capabilities.
  • SageMaker JumpStart integration for easy deployment.

Licensing & Compatibility

  • Licensed under the Apache-2.0 License.
  • Permissive license suitable for commercial use and integration into closed-source projects.

Limitations & Caveats

The project is primarily intended for research; for production use, the maintainers point to the AutoGluon and SageMaker integrations. A minor bug affecting metric computation was recently fixed, and updated results are pending.

Health Check

  • Last commit: 2 months ago
  • Responsiveness: 1 day
  • Pull Requests (30d): 2
  • Issues (30d): 1

Star History

  • 257 stars in the last 90 days
