Open codebase for large time-series model development and evaluation
OpenLTM provides a comprehensive codebase for developing and evaluating Large Time-Series Models (LTMs). Aimed at researchers and practitioners in time-series analysis, it offers pre-training code, datasets, and scripts for state-of-the-art models such as Timer-XL, Moirai, and GPT4TS, enabling efficient zero-shot and few-shot forecasting.
How It Works
The project leverages foundation backbones, primarily Transformers, and large-scale pre-training to create scalable deep models for time-series data. This approach allows models to generalize across diverse datasets and downstream tasks, offering strong zero-shot forecasting capabilities by learning a "language" of time series.
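As an illustration of this pattern, below is a minimal sketch of a patch-based, decoder-style Transformer forecaster: a univariate series is split into fixed-length patches that act as tokens, a causal Transformer processes them, and each token is decoded into the next patch. All names and hyperparameters here are invented for illustration; this is not OpenLTM's actual model code.

```python
import torch
import torch.nn as nn

class PatchTransformerForecaster(nn.Module):
    """Sketch of a patch-based Transformer forecaster (hypothetical, not OpenLTM's API)."""

    def __init__(self, patch_len=24, d_model=128, n_heads=8, n_layers=4):
        super().__init__()
        self.patch_len = patch_len
        self.embed = nn.Linear(patch_len, d_model)  # patch -> token embedding
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, patch_len)  # token -> next patch

    def forward(self, x):
        # x: (batch, seq_len), with seq_len divisible by patch_len
        b, t = x.shape
        patches = x.view(b, t // self.patch_len, self.patch_len)
        h = self.embed(patches)
        # Causal mask so each patch attends only to earlier patches.
        mask = nn.Transformer.generate_square_subsequent_mask(h.size(1))
        h = self.encoder(h, mask=mask)
        return self.head(h)  # (batch, n_patches, patch_len): next-patch predictions

x = torch.randn(2, 96)                  # two series of 96 time points
pred = PatchTransformerForecaster()(x)  # shape (2, 4, 24)
```

Pre-training this next-patch objective on large corpora such as UTSD is what yields the zero-shot forecasting behavior described above.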
Quick Start & Requirements
Install dependencies:

```
pip install -r requirements.txt
```

Place datasets in the ../dataset folder. Pre-training datasets include UTSD (1 billion time points) and ERA5-Family; supervised training datasets are available from TSLib.
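For orientation, the following sketch builds supervised (input, target) windows from a CSV placed in ../dataset. The file name (ETTh1.csv) and target column ("OT") follow common TSLib conventions and are assumptions, not files guaranteed to ship with OpenLTM.

```python
import numpy as np
import pandas as pd

def sliding_windows(series, input_len=96, pred_len=24):
    """Slice one series into (input, target) pairs for forecasting."""
    xs, ys = [], []
    for start in range(len(series) - input_len - pred_len + 1):
        xs.append(series[start : start + input_len])
        ys.append(series[start + input_len : start + input_len + pred_len])
    return np.stack(xs), np.stack(ys)

df = pd.read_csv("../dataset/ETTh1.csv")      # hypothetical TSLib-style file
values = df["OT"].to_numpy(dtype=np.float32)  # "OT": conventional ETT target column
x, y = sliding_windows(values)
print(x.shape, y.shape)                       # e.g. (N, 96) and (N, 24)
```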
Maintenance & Community
The project acknowledges contributions from various GitHub repositories and lists key contributors for inquiries. Specific community channels (Discord/Slack) or roadmaps are not explicitly mentioned in the README.
Licensing & Compatibility
The README does not state a license. Given the project's academic-research provenance, it is likely intended for research use; compatibility with commercial use would require explicit clarification from the maintainers.
Limitations & Caveats
While LTMs are presented as scalable, the README notes they are still smaller than foundation models in other modalities. Pre-training may require significant computational resources (e.g., A100 GPUs), though adaptation can be done on hardware like RTX 4090s. The project is actively updated with new models and implementations.