OpenLTM by thuml

Open codebase for large time-series model development and evaluation

created 9 months ago
390 stars

Top 74.7% on sourcepulse

Project Summary

OpenLTM provides a comprehensive codebase for developing and evaluating Large Time-Series Models (LTMs). It targets researchers and practitioners in time-series analysis, offering pre-training code, datasets, and scripts for various state-of-the-art models like Timer-XL, Moirai, and GPT4TS, enabling efficient zero-shot and few-shot forecasting.

How It Works

The project leverages foundation backbones, primarily Transformers, and large-scale pre-training to create scalable deep models for time-series data. This approach allows models to generalize across diverse datasets and downstream tasks, offering strong zero-shot forecasting capabilities by learning a "language" of time series.
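
As a rough illustration of this recipe (not the repository's actual code), the sketch below splits a univariate series into fixed-length patches, embeds them as tokens, runs a small Transformer encoder over them, and projects the last token to the next patch. The class name PatchForecaster and all hyperparameters are made up for illustration, and positional encodings and causal masking are omitted for brevity.

```python
# Minimal, illustrative patch-based Transformer forecaster (not OpenLTM code).
import torch
import torch.nn as nn


class PatchForecaster(nn.Module):  # hypothetical name, for illustration only
    def __init__(self, patch_len=24, d_model=128, n_heads=8, n_layers=3):
        super().__init__()
        self.patch_len = patch_len
        self.embed = nn.Linear(patch_len, d_model)              # patch -> token
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)   # token mixing
        self.head = nn.Linear(d_model, patch_len)                # token -> next patch

    def forward(self, x):
        # x: [batch, lookback], with lookback divisible by patch_len
        patches = x.unfold(-1, self.patch_len, self.patch_len)  # [B, n_patches, patch_len]
        hidden = self.encoder(self.embed(patches))
        return self.head(hidden[:, -1])                           # values for the next patch


model = PatchForecaster()
history = torch.randn(4, 96)   # 4 toy series, 96 past time points each
forecast = model(history)      # [4, 24]: forecast for the next 24 points
```

Pre-training a backbone of this kind on large corpora such as UTSD is what gives the published checkpoints their zero-shot forecasting behavior.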

Quick Start & Requirements

  • Install via pip install -r requirements.txt.
  • Requires Python 3.11.
  • Datasets should be placed in the ./dataset folder. Pre-training datasets include UTSD (1 billion time points) and ERA5-Family. Supervised training datasets are available from TSLib.
  • Scripts for supervised training, pre-training, and adaptation are provided.
  • Official Hugging Face model links are available for Chronos, Moirai, TimesFM, Timer-XL, Time-MoE, and TTMs (a minimal zero-shot example is sketched after this list).
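
For the Hugging Face checkpoints linked above, zero-shot inference typically goes through each model's own package rather than this repository's scripts. As a minimal sketch, assuming the external chronos-forecasting package is installed, querying a Chronos checkpoint looks roughly like this:

```python
# Zero-shot forecasting with a Chronos checkpoint from Hugging Face.
# Assumes `pip install chronos-forecasting`; this uses that library's
# documented entry points, not an OpenLTM script.
import torch
from chronos import ChronosPipeline

pipeline = ChronosPipeline.from_pretrained(
    "amazon/chronos-t5-small",   # any Chronos checkpoint on Hugging Face
    device_map="cpu",
    torch_dtype=torch.float32,
)

context = torch.arange(96, dtype=torch.float32).unsqueeze(0)   # toy series [1, 96]
samples = pipeline.predict(context, prediction_length=24)      # [1, num_samples, 24]
forecast = samples.median(dim=1).values                        # point forecast [1, 24]
```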

Highlighted Details

  • Implements and provides pre-training code for multiple LTMs including Timer-XL, Moirai, GPT4TS, and TTMs.
  • Includes scripts for supervised training, large-scale pre-training (UTSD, ERA5), and model adaptation (full-shot, few-shot).
  • Supports zero-shot forecasting evaluation with links to pre-trained models on Hugging Face.
  • Offers a pipeline to develop and integrate custom large time-series models (a hypothetical model skeleton is sketched after this list).
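
The custom-model pipeline is not detailed here; a common convention in TSLib-style codebases is a module that maps a [batch, lookback, n_vars] window to a [batch, horizon, n_vars] forecast. The skeleton below only illustrates that convention under assumed names and shapes; it is not OpenLTM's actual integration interface.

```python
# Hypothetical skeleton for a custom forecasting model following the common
# [batch, lookback, n_vars] -> [batch, horizon, n_vars] convention.
# This is NOT OpenLTM's actual integration interface.
import torch
import torch.nn as nn


class MyCustomLTM(nn.Module):  # hypothetical name
    def __init__(self, lookback=672, horizon=96, d_model=256):
        super().__init__()
        self.proj_in = nn.Linear(lookback, d_model)
        self.backbone = nn.GRU(d_model, d_model, batch_first=True)  # stand-in backbone
        self.proj_out = nn.Linear(d_model, horizon)

    def forward(self, x):
        # x: [batch, lookback, n_vars]; treat each variable as one token
        tokens = self.proj_in(x.transpose(1, 2))       # [batch, n_vars, d_model]
        hidden, _ = self.backbone(tokens)              # [batch, n_vars, d_model]
        return self.proj_out(hidden).transpose(1, 2)   # [batch, horizon, n_vars]


model = MyCustomLTM()
x = torch.randn(2, 672, 7)
print(model(x).shape)  # torch.Size([2, 96, 7])
```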

Maintenance & Community

The project acknowledges contributions from various GitHub repositories and lists key contributors for inquiries. Specific community channels (Discord/Slack) or roadmaps are not explicitly mentioned in the README.

Licensing & Compatibility

The README does not explicitly state a license. However, given the project's nature and common practices in academic research, it is likely intended for research purposes. Commercial use compatibility would require explicit clarification.

Limitations & Caveats

While LTMs are presented as scalable, the README notes they are still smaller than foundation models in other modalities. Pre-training may require significant computational resources (e.g., A100 GPUs), though adaptation can be done on hardware like RTX 4090s. The project is actively updated with new models and implementations.

Health Check

  • Last commit: 1 week ago
  • Responsiveness: 1 week
  • Pull Requests (30d): 0
  • Issues (30d): 5
  • Star history: 99 stars in the last 90 days
