autoformer_pytorch

This is an unofficial reproduction of Autoformer. The official code is here.

Dependencies: pytorch, einops (e.g. pip install torch einops)

Run

git clone https://github.com/celtics1863/autoformer_pytorch
cd autoformer_pytorch
python ETTmtrain.py

Results:

Config settings:

  • dataset: ETTm
  • pred_len: 96
  • label_len: 96
  • input_len: 96
Model                  MAE     RMSE
informer (official)    0.360   0.239
autoformer (official)  0.301   0.218
autoformer (ours)      0.291   0.175
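
For reference, the settings above amount to a configuration along the lines of the sketch below. The key names are illustrative assumptions, not necessarily the arguments actually used in ETTmtrain.py.

# Hypothetical configuration mirroring the experiment above
# (key names are assumptions; check ETTmtrain.py for the real ones)
config = {
    "dataset": "ETTm",
    "input_len": 96,   # encoder input window length
    "label_len": 96,   # decoder warm-up length, set equal to pred_len
    "pred_len": 96,    # forecast horizon
}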

Differences and Ideas

  1. If there are no time-stamp inputs, use positional embeddings instead.
  2. We set label_len == pred_len.
  3. We guess Autoformer is better suited to auto-regressive problems, since the decoder uses the input trend as a hint.
  4. Temporal embeddings are useful.
  5. Moving average is more useful than pooling (a decomposition sketch follows this list).
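
Point 5 refers to trend extraction. Below is a minimal PyTorch sketch of a moving-average series decomposition block in the spirit of Autoformer's series_decomp: a stride-1 average pool over a boundary-padded series extracts the trend, and the residual is taken as the seasonal part. The class names and kernel size are illustrative, not necessarily those used in this repository.

import torch
import torch.nn as nn

class MovingAvg(nn.Module):
    """Moving average over the time dimension, used to extract the trend."""
    def __init__(self, kernel_size: int):
        super().__init__()
        self.kernel_size = kernel_size
        self.avg = nn.AvgPool1d(kernel_size=kernel_size, stride=1, padding=0)

    def forward(self, x):
        # x: (batch, seq_len, channels)
        # pad both ends by repeating the boundary time steps so the length is preserved
        front = x[:, :1, :].repeat(1, (self.kernel_size - 1) // 2, 1)
        end = x[:, -1:, :].repeat(1, self.kernel_size // 2, 1)
        x = torch.cat([front, x, end], dim=1)
        # AvgPool1d expects (batch, channels, seq_len)
        return self.avg(x.permute(0, 2, 1)).permute(0, 2, 1)

class SeriesDecomp(nn.Module):
    """Decompose a series into seasonal (residual) and trend components."""
    def __init__(self, kernel_size: int = 25):
        super().__init__()
        self.moving_avg = MovingAvg(kernel_size)

    def forward(self, x):
        trend = self.moving_avg(x)
        seasonal = x - trend
        return seasonal, trend

# usage: x has shape (batch, input_len, features)
# seasonal, trend = SeriesDecomp(kernel_size=25)(torch.randn(32, 96, 7))

Unlike strided or max pooling, the stride-1 moving average keeps the output the same length as the input and gives a smooth trend estimate, which is what the decoder consumes as a hint.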

Thanks

@inproceedings{wu2021autoformer,
  title={Autoformer: Decomposition Transformers with {Auto-Correlation} for Long-Term Series Forecasting},
  author={Haixu Wu and Jiehui Xu and Jianmin Wang and Mingsheng Long},
  booktitle={Advances in Neural Information Processing Systems},
  year={2021}
}
