Timer (Large Time Series Model)

This repo provides official code and checkpoints for Timer: Generative Pre-trained Transformers Are Large Time Series Models.

Updates

🚩 News (2024.6) The pre-training dataset (UTSD) is available on HuggingFace!

🚩 News (2024.5) Timer has been accepted by ICML 2024; the camera-ready version (31 pages) is available.

🚩 News (2024.4) The pre-training scale has been extended, enabling zero-shot forecasting.

🚩 News (2024.2) Releasing model checkpoints and code for adaptation.

Introduction

Time Series Transformer (Timer) is a Generative Pre-trained Transformer for general time series analysis. You can visit our Homepage for a more detailed introduction.

Datasets

We curate Unified Time Series Datasets (UTSD), comprising 1B time points across 4 volumes, to facilitate research on large time series models and pre-training.

Tasks

Forecasting: We provide all scripts as well as datasets for few-shot forecasting in this repo.

Imputation: We propose segment-level imputation, which is more challenging than point-level imputation; a minimal masking sketch follows below.

Anomaly Detection: We provide new benchmarks of predictive anomaly detection on UCR Anomaly Archive.
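
To build intuition for the difference between the two imputation settings, here is a minimal sketch (not the repository's implementation) contrasting point-level and segment-level masking on a univariate series; the segment length and mask ratio are illustrative assumptions.

import numpy as np

def point_mask(length: int, ratio: float = 0.25, seed: int = 0) -> np.ndarray:
    """Point-level masking: individual time points go missing independently."""
    rng = np.random.default_rng(seed)
    return rng.random(length) < ratio                  # True = missing

def segment_mask(length: int, seg_len: int = 24, ratio: float = 0.25, seed: int = 0) -> np.ndarray:
    """Segment-level masking: whole contiguous segments go missing, so the model
    must generate multi-step structure rather than interpolate single points."""
    rng = np.random.default_rng(seed)
    mask = np.zeros(length, dtype=bool)
    n_seg = length // seg_len
    missing = rng.random(n_seg) < ratio
    mask[: n_seg * seg_len] = np.repeat(missing, seg_len)
    return mask

print(point_mask(96).sum(), segment_mask(96).sum())    # scattered points vs. whole segments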

Code for Fine-tuning

  1. Install PyTorch and the necessary dependencies.

pip install -r requirements.txt

  2. Put the datasets from Google Drive or Tsinghua Cloud under the folder ./dataset/.

  3. Put the checkpoint from Google Drive or Tsinghua Cloud under the folder ./checkpoints/.

  4. Train and evaluate the model. We provide scripts for the above tasks under the folder ./scripts/.

# forecasting
bash ./scripts/forecast/ECL.sh

# segment-level imputation
bash ./scripts/imputation/ECL.sh

# anomaly detection
bash ./scripts/anomaly_detection/UCR.sh

We also provide detailed README files for each task under the folder ./scripts/.

Approach

Pre-training and Adaptation

To pre-train on heterogeneous time series, we propose the single-series sequence (S3) format, which preserves series variations within a unified context length. Further, we convert forecasting, imputation, and anomaly detection into a unified generative task.
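
As a rough illustration of the idea (not the official preprocessing code), the sketch below converts heterogeneous multivariate datasets into single-series sequences: each variate is split off as a univariate series, normalized per series, and sliced into windows of a unified context length. The window length context_len is an assumption for illustration only.

import numpy as np

def to_s3(dataset, context_len: int = 672):
    """Flatten heterogeneous datasets into single-series sequences (S3).

    dataset: list of arrays shaped (time, variates), possibly with different
             lengths and numbers of variates.
    Returns an array of shape (num_windows, context_len).
    """
    windows = []
    for series in dataset:
        for v in range(series.shape[1]):               # treat each variate independently
            x = series[:, v].astype(np.float64)
            x = (x - x.mean()) / (x.std() + 1e-8)      # per-series normalization
            for start in range(0, len(x) - context_len + 1, context_len):
                windows.append(x[start:start + context_len])
    return np.stack(windows)

# toy usage: two datasets with different lengths and numbers of variates
data = [np.random.randn(2000, 3), np.random.randn(1500, 7)]
print(to_s3(data).shape)                               # (num_windows, 672)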

Model Architecture

Given the limited exploration of backbones for large time series models, we extensively evaluate candidate backbones and adopt the decoder-only Transformer with autoregressive generation as the backbone for LTSMs.
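
For concreteness, a decoder-only Transformer over time-series segments might look like the PyTorch sketch below: each token is a segment of seg_len points, a causal mask enforces autoregression, and the model predicts the next segment at every position. Hyperparameters, module names, and the omission of positional encoding are illustrative assumptions, not the released architecture.

import torch
import torch.nn as nn

class DecoderOnlyTSM(nn.Module):
    """Toy decoder-only Transformer: tokens are time-series segments,
    trained to predict the next segment at every position (GPT-style)."""
    def __init__(self, seg_len=96, d_model=256, n_heads=8, n_layers=4):
        super().__init__()
        self.seg_len = seg_len
        self.embed = nn.Linear(seg_len, d_model)        # segment -> token embedding
        block = nn.TransformerEncoderLayer(d_model, n_heads, dim_feedforward=4 * d_model,
                                           batch_first=True, norm_first=True)
        self.blocks = nn.TransformerEncoder(block, n_layers)
        self.head = nn.Linear(d_model, seg_len)         # token -> values of the next segment

    def forward(self, x):
        # x: (batch, num_segments, seg_len); positional encoding omitted for brevity
        causal = nn.Transformer.generate_square_subsequent_mask(x.size(1)).to(x.device)
        h = self.blocks(self.embed(x), mask=causal)     # causal mask = autoregressive attention
        return self.head(h)

# next-segment prediction objective on a toy batch
model = DecoderOnlyTSM()
series = torch.randn(8, 7, 96)                          # 8 series, 7 segments of 96 points each
pred = model(series)
loss = nn.functional.mse_loss(pred[:, :-1], series[:, 1:])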

Performance

Timer achieves state-of-the-art performance in each task, and we demonstrate the benefit of pre-training in few-shot scenarios.

Scalability

By increasing the parameters and pre-training scale, Timer achieves a notable performance improvement: 0.231 $\to$ 0.138 (−40.3%), surpassing previous state-of-the-art deep forecasters.


Flexible Sequence Length

The decoder-only architecture provides the flexibility to accommodate time series of different lookback and forecast lengths.
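
The sketch below (continuing the toy model above, as an assumption) illustrates how a decoder-only model can serve arbitrary forecast horizons: it repeatedly predicts the next segment, appends it to the context, and stops once the requested length is reached.

@torch.no_grad()
def rolling_forecast(model, context, horizon):
    """Autoregressive rolling forecast with a next-segment predictor.

    context: (batch, num_segments, seg_len) lookback window of any length
    horizon: number of future time points to generate
    """
    generated = []
    while sum(g.size(-1) for g in generated) < horizon:
        next_seg = model(context)[:, -1:]               # prediction at the last position
        generated.append(next_seg.squeeze(1))
        context = torch.cat([context, next_seg], dim=1) # extend the lookback and continue
    return torch.cat(generated, dim=-1)[..., :horizon]

# assumes `model` and `series` from the previous sketch
forecast = rolling_forecast(model, series, horizon=192)
print(forecast.shape)                                   # (8, 192)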


Showcases

Forecasting under data scarcity

Imputation with few-shot samples

Anomaly detection on UCR Anomaly Archive

Future Work

We are preparing to provide an online service for zero-shot forecasting. Please stay tuned for updates!

Citation

If you find this repo helpful, please cite our paper.

@article{liu2024timer,
  title={Timer: Transformers for Time Series Analysis at Scale},
  author={Liu, Yong and Zhang, Haoran and Li, Chenyu and Huang, Xiangdong and Wang, Jianmin and Long, Mingsheng},
  journal={arXiv preprint arXiv:2402.02368},
  year={2024}
}

Acknowledgement

We appreciate the following GitHub repos for their valuable code and efforts.

Contact

If you have any questions or want to use the code, feel free to contact the authors.
