This is an unofficial PyTorch reproduction of Autoformer. The official code is available at https://github.com/thuml/Autoformer.

Dependencies: PyTorch, einops
```bash
git clone https://github.com/celtics1863/autoformer_pytorch
cd autoformer_pytorch
python ETTmtrain.py
```
Config settings (a minimal configuration sketch follows this list):
- dataset: ETTm
- pred_len: 96
- label_len: 96
- input_len: 96
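
For reference, the settings above written out as a Python dict. The field names are illustrative only; the actual argument names used in ETTmtrain.py may differ.

```python
# Illustrative configuration only; argument names in ETTmtrain.py may differ.
config = {
    "dataset": "ETTm",
    "input_len": 96,   # encoder input window
    "label_len": 96,   # decoder warm-up window (set equal to pred_len here)
    "pred_len": 96,    # forecast horizon
}
```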
| model | MAE | RMSE |
| --- | --- | --- |
| Informer (official) | 0.360 | 0.239 |
| Autoformer (official) | 0.301 | 0.218 |
| Autoformer (ours) | 0.291 | 0.175 |
- If no timestamp inputs are available, positional embeddings are used instead.
- We set label_len == pred_len.
- We suspect Autoformer is especially well suited to auto-regressive problems, because the decoder uses the input trend as a hint.
- Temporal embeddings are useful.
- Moving average works better than pooling for extracting the trend; see the decomposition sketch after this list.
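
As a concrete illustration of the moving-average decomposition and of the trend-as-hint decoder input mentioned above, here is a minimal sketch following the idea described in the Autoformer paper. The class and variable names are ours, and `kernel_size=25` is an illustrative default rather than necessarily the value this repo uses.

```python
import torch
import torch.nn as nn

class SeriesDecomposition(nn.Module):
    """Split a series into trend (moving average) and seasonal (residual) parts."""
    def __init__(self, kernel_size: int = 25):
        super().__init__()
        self.kernel_size = kernel_size
        self.avg = nn.AvgPool1d(kernel_size=kernel_size, stride=1)

    def forward(self, x):
        # x: (batch, length, channels)
        # pad both ends by repeating boundary values so the output keeps the input length
        pad = (self.kernel_size - 1) // 2
        front = x[:, :1, :].repeat(1, pad, 1)
        end = x[:, -1:, :].repeat(1, pad, 1)
        padded = torch.cat([front, x, end], dim=1)
        trend = self.avg(padded.permute(0, 2, 1)).permute(0, 2, 1)
        seasonal = x - trend
        return seasonal, trend

# Decoder-input sketch: the prediction horizon's trend is initialised from the
# mean of the input window, and its seasonal part with zeros.
batch, input_len, label_len, pred_len, channels = 8, 96, 96, 96, 7
x = torch.randn(batch, input_len, channels)
seasonal, trend = SeriesDecomposition(kernel_size=25)(x)
mean = x.mean(dim=1, keepdim=True).repeat(1, pred_len, 1)
trend_init = torch.cat([trend[:, -label_len:, :], mean], dim=1)
seasonal_init = torch.cat([seasonal[:, -label_len:, :],
                           torch.zeros(batch, pred_len, channels)], dim=1)
```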
Citation:

```
@inproceedings{wu2021autoformer,
  title={Autoformer: Decomposition Transformers with {Auto-Correlation} for Long-Term Series Forecasting},
  author={Haixu Wu and Jiehui Xu and Jianmin Wang and Mingsheng Long},
  booktitle={Advances in Neural Information Processing Systems},
  year={2021}
}
```