
Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting #66

Paper

Link: https://arxiv.org/pdf/2012.07436.pdf
Year: 2021

Summary

  • reduce space complexity: the query sparsity measurement that ranks queries is estimated from a random sample of keys, so memory also stays O(L log L)
  • reduce time complexity: ProbSparse self-attention computes full attention only for the top-u dominant queries, giving O(L log L) instead of O(L²) (see the sketch after this list)
  • predict sequence in one batch: generative-style decoder generates the entire long output sequence in one forward pass, avoiding step-by-step autoregressive decoding (a decoder-input sketch appears under Methods)

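A minimal NumPy sketch of the ProbSparse idea (not the authors' implementation; function and variable names are mine). For clarity it scores every query against all keys, whereas the paper estimates the measurement from a random sample of keys to keep the whole operation O(L log L):

```python
import numpy as np

def probsparse_attention(Q, K, V, u):
    """Sketch of ProbSparse self-attention: score each query with the
    sparsity measurement M(q, K) = max_j(q·k_j/√d) − mean_j(q·k_j/√d),
    compute full attention only for the top-u queries, and give the
    remaining "lazy" queries the mean of V as a trivial output."""
    L_Q, d = Q.shape
    scores = Q @ K.T / np.sqrt(d)                 # (L_Q, L_K) scaled dot products
    M = scores.max(axis=1) - scores.mean(axis=1)  # query sparsity measurement
    top = np.argsort(M)[-u:]                      # indices of the u dominant queries

    out = np.tile(V.mean(axis=0), (L_Q, 1))       # lazy queries -> mean of values
    s = scores[top]                               # full attention for top-u only
    w = np.exp(s - s.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)             # row-wise softmax
    out[top] = w @ V
    return out

# Example: u grows only logarithmically with the sequence length,
# e.g. u = c·ln(L_Q) for a small constant sampling factor c.
Q = np.random.randn(96, 64); K = np.random.randn(96, 64); V = np.random.randn(96, 64)
y = probsparse_attention(Q, K, V, u=int(5 * np.log(96)))
```
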
Methods

(figure: Informer model architecture overview)

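The generative-style decoder noted in the Summary avoids step-by-step decoding by feeding the decoder a "start token" (a recent slice of the known input) concatenated with zero placeholders for the future steps; a single forward pass then fills in all predictions at once. A minimal sketch of building that decoder input, with illustrative names and shapes (not the authors' code):

```python
import numpy as np

def build_decoder_input(x_enc, label_len, pred_len):
    """Informer-style generative decoding: the decoder input is a start
    token (the last label_len observed steps) concatenated with zeros
    standing in for the pred_len future steps to be predicted."""
    start_token = x_enc[-label_len:]                    # known history slice
    placeholder = np.zeros((pred_len, x_enc.shape[1]))  # masked future targets
    return np.concatenate([start_token, placeholder], axis=0)

# Example: 96 observed steps, 48-step start token, 24-step forecast horizon;
# one decoder forward pass then yields all 24 predictions at once.
x_enc = np.random.randn(96, 7)
dec_in = build_decoder_input(x_enc, label_len=48, pred_len=24)
assert dec_in.shape == (72, 7)
```
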
Results

  • same time and space complexity as Reformer, but needs only one forward pass for inference
  • significantly better results than RNN/LSTM baselines
  • outperforms Reformer
  • achieves lower MSE than DeepAR, ARIMA, and Prophet
