moment by moment-timeseries-foundation-model

Time-series foundation model for general-purpose time-series analysis

created 1 year ago
572 stars

Top 57.2% on sourcepulse

Project Summary

MOMENT is a family of open-source foundation models for general-purpose time-series analysis, aimed at researchers and practitioners. It addresses the challenges of large-scale time-series pre-training by introducing the Time-series Pile, a diverse collection of public time-series datasets, along with a benchmark for evaluating models under limited supervision. The models deliver strong performance on imputation, anomaly detection, classification, and forecasting tasks.

How It Works

MOMENT employs a patching strategy: each time series is divided into fixed-length sub-sequences (patches), which are mapped to patch embeddings. During pre-training, random patches are masked and the model learns to reconstruct the original series using a lightweight reconstruction head. This approach lets MOMENT capture temporal characteristics such as trend, scale, frequency, and phase, and learn meaningful representations even without task-specific fine-tuning.
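The patch-and-mask objective described above can be sketched in a few lines of NumPy. This is a toy illustration, not MOMENT's actual implementation: the patch length, mask ratio, and placeholder reconstruction are illustrative choices, and a real model would embed each patch, run it through a transformer encoder, and reconstruct masked patches with a learned head.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy series: 512 timesteps, cut into fixed-length patches of 8
# (patch length and mask ratio here are illustrative, not MOMENT's settings).
series = np.sin(np.linspace(0, 8 * np.pi, 512)) + 0.1 * rng.standard_normal(512)
patch_len = 8
patches = series.reshape(-1, patch_len)  # shape: (64, 8)

# Randomly mask a fraction of the patches, as in masked pre-training.
mask_ratio = 0.3
n_patches = patches.shape[0]
masked_idx = rng.choice(n_patches, size=int(mask_ratio * n_patches), replace=False)
masked = patches.copy()
masked[masked_idx] = 0.0  # zero placeholder stands in for a learned [MASK] embedding

# A real model would map patches to embeddings, encode them, and predict the
# masked patches; here a stand-in "reconstruction" just shows the objective:
# mean squared error computed only over the masked patches.
reconstruction = masked
loss = np.mean((reconstruction[masked_idx] - patches[masked_idx]) ** 2)
print(patches.shape, len(masked_idx), float(loss))
```

Because the loss is computed only on masked patches, the model is forced to infer missing sub-sequences from their context, which is what makes the learned representations transfer to imputation and forecasting.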

Quick Start & Requirements

  • Install via pip: pip install momentfm or pip install git+https://github.com/moment-timeseries-foundation-model/moment.git
  • Recommended Python version: 3.11
  • Tutorials and experiments can be reproduced on a single NVIDIA A6000 GPU (48 GiB of GPU memory).
  • Official documentation and tutorials: https://github.com/moment-timeseries-foundation-model/moment

Highlighted Details

  • Outperforms statistical imputation baselines and is competitive with other methods in anomaly detection, classification, and short-horizon forecasting without fine-tuning.
  • Achieves state-of-the-art results on anomaly detection and competitive performance in long-horizon forecasting via linear probing.
  • Embeddings capture nuanced time-series properties like trend, scale, frequency, and phase.
  • Demonstrates distinct class representations for datasets like ECG5000 without specific fine-tuning.

Maintenance & Community

  • The project is associated with Auton Lab at Carnegie Mellon University.
  • Active development is indicated by recent releases of small and base model versions and research code.
  • Contributions are encouraged; contribution guidelines are forthcoming.
  • Related work includes JoLT for multimodal time-series and text foundation models.

Licensing & Compatibility

  • Licensed under the MIT License, permitting commercial use and closed-source linking.

Limitations & Caveats

  • While the research code is available, it is noted as "messier" than the released package.
  • Python versions other than 3.11 are expected to work but were untested at the time the README was written.
Health Check

  • Last commit: 2 months ago
  • Responsiveness: 1 week
  • Pull Requests (30d): 0
  • Issues (30d): 0
  • Star history: 69 stars in the last 90 days
