Periodicity Decoupling Framework for Long-term Series Forecasting 🚀

📰 News

  • [2024-01-15] 🎉 Our paper has been accepted at ICLR 2024.

🌟 Results

Quantitatively, PDF(720) delivers the following overall improvements:

  • vs. Transformer-based models: 14.59% lower MSE and 10.77% lower MAE.
  • vs. CNN-based models: 24.61% lower MSE and 19.91% lower MAE.
  • vs. Linear-based models: 7.05% lower MSE and 5.51% lower MAE.

🛠 Prerequisites

Ensure you are using Python 3.9 and install the necessary dependencies by running:

pip install -r requirements.txt

📊 Prepare Datasets

Begin by downloading the required datasets. All datasets are conveniently available at Autoformer. Create a separate folder named ./dataset and organize the CSV files as shown below:

dataset
├── electricity.csv
├── ETTh1.csv
├── ETTh2.csv
├── ETTm1.csv
├── ETTm2.csv
├── traffic.csv
└── weather.csv
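
As a quick sanity check before training, a short script like the one below can confirm the files are in place (a minimal sketch, not part of this repo; it only assumes the ./dataset layout shown above):

import os

# Expected CSV files under ./dataset (matching the tree above)
expected = [
    "electricity.csv", "ETTh1.csv", "ETTh2.csv",
    "ETTm1.csv", "ETTm2.csv", "traffic.csv", "weather.csv",
]

missing = [name for name in expected if not os.path.isfile(os.path.join("dataset", name))]
print("Missing files:", missing if missing else "none")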

💻 Training

All scripts are located in ./scripts/PDF. Choose from two historical input lengths: 336 and 720. For instance, to train a model on the ETTh2 dataset with an input length of 720, run:

sh ./scripts/PDF/720/ETTh2.sh

After training:

  • Your trained model is stored in ./checkpoints.
  • Visualization outputs are saved in ./test_results.
  • Numerical results in .npy format are saved in ./results (see the loading sketch below).
  • A summary of quantitative metrics is written to ./results.txt.
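
To inspect the saved arrays, a sketch like the following can help (the per-run subfolder and the file names pred.npy, true.npy, and metrics.npy are assumptions based on common long-term forecasting codebases, so adjust them to whatever your run actually writes under ./results):

import os
import numpy as np

# Pick a finished run under ./results (assumption: one subfolder per experiment setting)
run_dir = os.path.join("results", sorted(os.listdir("results"))[0])

# Assumed file names -- check what your run produced before relying on them
pred = np.load(os.path.join(run_dir, "pred.npy"))        # model predictions
true = np.load(os.path.join(run_dir, "true.npy"))        # ground-truth targets
metrics = np.load(os.path.join(run_dir, "metrics.npy"))  # summary metrics, e.g. MAE/MSE

print("pred:", pred.shape, "true:", true.shape)
print("metrics:", metrics)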

📚 Citation

If you find this repo useful, please consider citing our paper as follows:

@inproceedings{dai2024period,
  title={Periodicity Decoupling Framework for Long-term Series Forecasting},
  author={Dai, Tao and Wu, Beiliang and Liu, Peiyuan and Li, Naiqi and Bao, Jigang and Jiang, Yong and Xia, Shu-Tao},
  booktitle={International Conference on Learning Representations},
  year={2024}
}

🙏 Acknowledgement

Special thanks to the following repositories for their invaluable code and datasets:

📩 Contact

If you have any questions, please contact [email protected] or submit an issue.