This repo provides the official code and checkpoints for [Timer: Transformers for Time Series Analysis at Scale](https://arxiv.org/abs/2402.02368).
🚩 News (2024.3) Checkpoint pre-trained on UTSD-4G is available.
🚩 News (2024.3) We provide the downstream fine-tuning code for the forecasting task.
Time Series Transformer (Timer) comprises GPT-style Transformers pre-trained on multi-domain time series, serving as Large Time Series Models (LTSMs).
We curate large-scale datasets comprising 1B time points, propose a unified training strategy based on single-series sequences, and present Timer with a decoder-only architecture. As an LTSM, Timer provides:
- Generalization ability: one model fits all domains.
- Task generality: one model copes with various tasks.
- Scalability: performance increases with the scale of pre-training.
- Install PyTorch and the necessary dependencies:

  ```bash
  pip install -r requirements.txt
  ```
- The datasets can be obtained from Google Drive or Tsinghua Cloud.
- Download the pre-trained checkpoints and put them under the folder `./checkpoints/` (a quick sanity check on the downloaded file is sketched after these steps):
  - Timer_67M_UTSD_4G [Google] [Tsinghua]
- Train and evaluate the model. We provide scripts for these tasks under the folder `./scripts/`:

  ```bash
  # forecasting
  bash ./scripts/forecast/ECL.sh

  # TODO: segment-level imputation
  bash ./scripts/imputation/ECL.sh

  # TODO: anomaly detection on the fly
  bash ./scripts/anomaly_detection/UCR.sh
  ```
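After downloading a checkpoint, you can run a quick sanity check on its contents. The snippet below is a minimal sketch: it assumes a standard PyTorch checkpoint file, and the file name used here is a placeholder rather than the actual name of the released file.

```python
# Minimal sanity check on a downloaded checkpoint (file name is a placeholder).
import torch

ckpt = torch.load("./checkpoints/Timer_67M_UTSD_4G.pt", map_location="cpu")
# A checkpoint may be a raw state_dict or a dict that wraps one.
state_dict = ckpt.get("state_dict", ckpt) if isinstance(ckpt, dict) else ckpt
total_params = sum(t.numel() for t in state_dict.values() if torch.is_tensor(t))
print(f"{len(state_dict)} tensors, {total_params / 1e6:.1f}M parameters")
```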
The provided scripts cover:

- Forecasting with data scarcity (limited downstream training samples)
- Segment-level imputation with few-shot samples
- On-the-fly anomaly detection on the UCR Anomaly Archive
We curate the Unified Time Series Dataset (UTSD), which covers 7 domains and up to 1 billion time points, organized with hierarchical capacities to facilitate research on scalability and domain transfer.
To facilitate pre-training on extensive time series, we convert heterogeneous series into single-series sequences (S3), preserving the patterns of series variations within a unified context length, moving toward the well-established tokenization used in natural language.
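As a rough illustration of this S3 preprocessing (not the repo's implementation; the context length and per-window standardization below are assumptions made for the sketch):

```python
# Sketch of the single-series sequence (S3) idea: split heterogeneous
# (multivariate, variable-length) series into normalized univariate windows
# of a unified context length.
import numpy as np

def to_single_series_sequences(series_list, context_len=672):
    """series_list: list of arrays shaped (time, variables)."""
    samples = []
    for series in series_list:
        for var in series.T:                      # treat each variate independently
            for start in range(0, len(var) - context_len + 1, context_len):
                window = var[start:start + context_len]
                # per-window standardization keeps variation patterns comparable
                window = (window - window.mean()) / (window.std() + 1e-8)
                samples.append(window.astype(np.float32))
    return np.stack(samples)                      # (num_samples, context_len)

# Example: two heterogeneous series with different lengths and variable counts
data = [np.random.randn(2000, 3), np.random.randn(1500, 1)]
print(to_single_series_sequences(data).shape)
```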
Given the substantial progress of decoder-only large language models and our evaluation of alternative backbones, we adopt the GPT-style Transformer with autoregressive generation for LTSMs.
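A toy version of such a backbone is sketched below: time-series patches are embedded as tokens, processed with causal self-attention, and trained to predict the next patch. The class name, dimensions, patch length, and the omission of positional embeddings are simplifications for illustration, not the released configuration.

```python
# Toy decoder-only (GPT-style) Transformer over time-series patches.
# Sizes are illustrative; positional embeddings are omitted for brevity.
import torch
import torch.nn as nn

class TinyTimerBackbone(nn.Module):
    def __init__(self, patch_len=96, d_model=256, n_heads=8, n_layers=4):
        super().__init__()
        self.embed = nn.Linear(patch_len, d_model)            # patch -> token embedding
        layer = nn.TransformerEncoderLayer(d_model, n_heads, 4 * d_model, batch_first=True)
        self.blocks = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, patch_len)              # token -> next-patch prediction

    def forward(self, patches):                                # (batch, num_patches, patch_len)
        n = patches.size(1)
        causal_mask = torch.triu(                              # forbid attention to future patches
            torch.full((n, n), float("-inf"), device=patches.device), diagonal=1)
        h = self.blocks(self.embed(patches), mask=causal_mask)
        return self.head(h)                                    # prediction of the next patch at each position

model = TinyTimerBackbone()
patches = torch.randn(8, 7, 96)                                # 8 series, 7 patches of length 96
pred = model(patches)
loss = nn.functional.mse_loss(pred[:, :-1], patches[:, 1:])    # next-token (next-patch) objective
```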
Timer is applicable to various tasks, all realized through a unified generative approach.
| Task | Formulation |
|---|---|
| Time Series Forecasting | Next Token Prediction |
| Imputation (Segment-level) | Denoising Autoencoding |
| Anomaly Detection (Predictive) | Next Token Prediction |
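For instance, segment-level imputation can be cast as denoising autoencoding by masking whole patches and reconstructing them. The sketch below illustrates the idea; the masking ratio and zero-filling are assumptions, not the repo's exact recipe.

```python
# Illustrative segment-level masking for denoising-autoencoding-style imputation.
import torch
import torch.nn.functional as F

def mask_segments(patches, mask_ratio=0.25):
    # patches: (batch, num_patches, patch_len); True in `mask` marks a masked segment
    mask = torch.rand(patches.shape[:2], device=patches.device) < mask_ratio
    corrupted = patches.masked_fill(mask.unsqueeze(-1), 0.0)   # zero out masked segments
    return corrupted, mask

patches = torch.randn(8, 7, 96)
corrupted, mask = mask_segments(patches)
# Reconstruction loss on the masked segments only, e.g. with the toy backbone above:
# loss = F.mse_loss(model(corrupted)[mask], patches[mask])
```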
We compare Timer with state-of-the-art approaches and demonstrate the benefit of pre-training in data-scarce scenarios, known as the few-shot capability of large models.
By increasing the parameters and pre-training scale, Timer achieves notable performance improvement: 0.231
The decoder-only architecture provides additional flexibility to accommodate time series of different lookback and forecast lengths.
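Concretely, an arbitrary forecast horizon can be produced by rolling the model autoregressively and appending each predicted patch to the context. This sketch assumes a next-patch predictor like the toy backbone above, not the repo's inference code.

```python
# Autoregressive rolling forecast: the lookback is whatever context is passed in,
# and the horizon is however many patches are rolled out.
import torch

@torch.no_grad()
def rolling_forecast(model, context, horizon_patches):
    # context: (batch, num_patches, patch_len)
    for _ in range(horizon_patches):
        next_patch = model(context)[:, -1:, :]             # prediction at the last position
        context = torch.cat([context, next_patch], dim=1)  # extend the context
    return context[:, -horizon_patches:, :]                # (batch, horizon_patches, patch_len)

# Example with the toy backbone: forecast 3 patches (3 * 96 future points)
# forecast = rolling_forecast(model, patches, horizon_patches=3)
```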
We are preparing to release the composition of datasets (UTSD), larger checkpoints, and code for pre-training. Please stay tuned for the update!
If you find this repo helpful, please cite our paper.
```bibtex
@article{liu2024timer,
  title={Timer: Transformers for Time Series Analysis at Scale},
  author={Liu, Yong and Zhang, Haoran and Li, Chenyu and Huang, Xiangdong and Wang, Jianmin and Long, Mingsheng},
  journal={arXiv preprint arXiv:2402.02368},
  year={2024}
}
```
We appreciate the following GitHub repos for their valuable code and efforts.
- Time-Series-Library (https://github.com/thuml/Time-Series-Library)
- iTransformer (https://github.com/thuml/iTransformer)
If you have any questions or want to use the code, feel free to contact:
- Yong Liu (liuyong21@mails.tsinghua.edu.cn)
- Haoran Zhang (z-hr20@mails.tsinghua.edu.cn)
- Chenyu Li (lichenyu20@mails.tsinghua.edu.cn)