Dateformer: Time-modeling Transformer for Long-term Series Forecasting
To install requirements:
pip install -r requirements.txt
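To keep the dependencies isolated from your system Python, you may optionally create a virtual environment first (standard Python tooling, not specific to this repo; the directory name "venv" is arbitrary):

```shell
# Create an isolated virtual environment (the name "venv" is just an example)
python3 -m venv venv
# Activate it; subsequent pip/python commands then use this environment
. venv/bin/activate
```

After activation, run the pip install command above inside the environment.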
To reproduce the results in the paper, run this command:
bash ./scripts/experiments.sh
We experiment on 7 datasets covering 4 mainstream applications, and compare our model with 6 baselines, including FEDformer, Autoformer, Informer, and Pyraformer. Across all setups on all datasets, Dateformer achieves state-of-the-art performance, with a 33.6% relative improvement over previous baselines.
We are grateful to the following GitHub repos for their valuable code bases and datasets:
https://github.com/DAMO-DI-ML/ICML2022-FEDformer
https://github.com/zhouhaoyi/Informer2020
https://github.com/zhouhaoyi/ETDataset