Research paper code for zero-shot time series forecasting with LLMs
Top 45.3% on sourcepulse
LLMTime enables zero-shot time series forecasting by encoding numerical data as text and leveraging Large Language Models (LLMs) for extrapolation. It targets researchers and practitioners seeking to forecast time series without task-specific training, offering competitive performance against traditional methods, especially with powerful base LLMs.
How It Works
LLMTime represents time series data as textual prompts, allowing LLMs to predict future values through text completion. This approach bypasses the need for task-specific training on the target dataset. The method's effectiveness scales with the capability of the underlying LLM, though alignment-tuned models such as GPT-4 can underperform base models such as GPT-3, likely because RLHF alters the calibrated sampling behavior the method relies on.
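As a concrete illustration, the sketch below shows one way a series can be rendered as text and decoded back. The helper names, precision handling, and exact delimiters are assumptions for exposition, not the repository's serializer.

```python
# Illustrative encoding/decoding in the spirit of LLMTime (assumed details,
# not the repository's exact format): fixed-precision values with decimal
# points dropped, digits separated by spaces, values separated by commas.

def serialize(values, precision=2):
    """[0.64, 1.22] -> '0 6 4 , 1 2 2' (assumes non-negative values)."""
    groups = []
    for v in values:
        digits = f"{v:.{precision}f}".replace(".", "")
        groups.append(" ".join(digits))
    return " , ".join(groups)

def deserialize(text, precision=2):
    """Invert the encoding, ignoring any malformed trailing group."""
    values = []
    for group in text.split(","):
        digits = "".join(ch for ch in group if ch.isdigit())
        if digits:
            values.append(int(digits) / 10 ** precision)
    return values

history = [0.64, 1.22, 1.58, 1.93]
prompt = serialize(history)  # "0 6 4 , 1 2 2 , 1 5 8 , 1 9 3"
# The LLM continues this string; deserialize() maps its completion back to
# numbers, which become the zero-shot forecast.
```

Separating digits with spaces is motivated by BPE tokenizers, which would otherwise split multi-digit numbers into irregular chunks.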
Quick Start & Requirements
Run source install.sh to create a conda environment named llmtime (edit install.sh for other CUDA versions). Set your API keys with export OPENAI_API_KEY=<your key> and export MISTRAL_KEY=<your key>. The demo notebook demo.ipynb requires no GPUs; a minimal forecasting sketch is shown below.
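For orientation, here is a hedged sketch of a single zero-shot forecast once OPENAI_API_KEY is set. The serialization format, model choice, max_tokens, and temperature are illustrative assumptions rather than the repository's tuned pipeline.

```python
# Hedged sketch of a zero-shot forecast via the OpenAI completions endpoint
# (openai>=1.0 SDK); settings below are placeholders, not the repo's hypers.
import os
from openai import OpenAI

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

history = [0.64, 1.22, 1.58, 1.93, 2.41]
# Encode the series as space-separated digits with comma delimiters
# (two-decimal precision, decimal points dropped), then ask the model
# to continue the string.
prompt = " , ".join(" ".join(f"{v:.2f}".replace(".", "")) for v in history) + " ,"

resp = client.completions.create(
    model="gpt-3.5-turbo-instruct",  # a completion-style model
    prompt=prompt,
    max_tokens=40,
    temperature=0.7,                 # placeholder; see Limitations on temperature tuning
)

def parse(text, precision=2):
    """Decode comma-separated digit groups back into floats."""
    values = []
    for group in text.split(","):
        digits = "".join(ch for ch in group if ch.isdigit())
        if digits:
            values.append(int(digits) / 10 ** precision)
    return values

forecast = parse(resp.choices[0].text)
print(forecast)
```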
Highlighted Details
Maintenance & Community
Licensing & Compatibility
Limitations & Caveats
RLHF-aligned models may show degraded performance compared to base models. The README suggests specific temperature tuning for gpt-3.5-turbo-instruct.