Welcome to the official repository of the SegRNN paper: "Segment Recurrent Neural Network for Long-Term Time Series Forecasting."
SegRNN is an innovative RNN-based model designed for Long-term Time Series Forecasting (LTSF). It incorporates two fundamental strategies:
- The replacement of point-wise iterations with segment-wise iterations
- The substitution of Recurrent Multi-step Forecasting (RMF) with Parallel Multi-step Forecasting (PMF)
By combining these two strategies, SegRNN achieves state-of-the-art results with just a single layer of GRU, making it extremely lightweight and efficient.
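To make the segment-wise idea concrete, here is a minimal numpy sketch (not the repo's actual implementation) of how a look-back window can be cut into non-overlapping segments so that each recurrent step consumes one segment instead of one time point. The function name `segment` and the channel-independent layout are illustrative assumptions:

```python
import numpy as np

def segment(x, seg_len):
    """Split a (batch, seq_len, channels) series into non-overlapping
    segments of length seg_len. Channels are treated independently,
    yielding (batch * channels, seq_len // seg_len, seg_len): each RNN
    step then sees one whole segment rather than one time point."""
    b, t, c = x.shape
    assert t % seg_len == 0, "look-back window must divide evenly into segments"
    # (b, t, c) -> (b, c, t) -> (b * c, n_segments, seg_len)
    return x.transpose(0, 2, 1).reshape(b * c, t // seg_len, seg_len)

x = np.random.randn(32, 96, 7)   # e.g. an ETT-style window: 96 steps, 7 channels
segs = segment(x, seg_len=12)
print(segs.shape)                # (224, 8, 12): 8 recurrent steps instead of 96
```

Shortening the recurrence from 96 steps to 8 is what lets a single GRU layer remain accurate and fast over long windows.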
Many readers have asked why there is such a large gap between the MSE and MAE metrics on the Traffic dataset in the paper. The reason is that outlier extreme values in the Traffic data inflate the MSE. After adopting the now-standard RevIN strategy, this issue was resolved and forecast accuracy improved further.
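RevIN (Reversible Instance Normalization) normalizes each input window by its own per-channel statistics before the model and restores the original scale afterwards, which tames extreme values like those in Traffic. A minimal numpy sketch of the idea follows (illustrative only; the real RevIN also includes learnable affine parameters, which are omitted here):

```python
import numpy as np

def revin_normalize(x, eps=1e-5):
    """Normalize each (batch, seq_len, channels) window by its own
    per-instance, per-channel mean and std over the time axis."""
    mean = x.mean(axis=1, keepdims=True)
    std = x.std(axis=1, keepdims=True) + eps
    return (x - mean) / std, (mean, std)

def revin_denormalize(y, stats):
    """Map the model's forecast back to the original scale."""
    mean, std = stats
    return y * std + mean

x = np.random.randn(4, 96, 7) * 50 + 100   # series with large, varying scale
x_norm, stats = revin_normalize(x)
# ... the model forecasts in the normalized space ...
y = revin_denormalize(x_norm, stats)        # round-trip recovers the input
```

Because the squared error is computed on the de-normalized forecasts, reducing the influence of extreme values during training directly narrows the MSE/MAE gap.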
To get started, ensure you have Conda installed on your system and follow these steps to set up the environment:
conda create -n SegRNN python=3.8
conda activate SegRNN
pip install -r requirements.txt
All the datasets needed for SegRNN can be obtained from the Google Drive link provided in the Autoformer repository.
Create a separate folder named ./dataset
and place all the CSV files in this directory.
Note: Place the CSV files directly into this directory, such as "./dataset/ETTh1.csv"
You can easily reproduce the results from the paper by running the provided script command. For instance, to reproduce the main results, execute the following command:
sh run_main.sh
Similarly, you can run individual scripts for independent tasks, such as obtaining the results on ETTh1:
sh scripts/SegRNN/etth1.sh
You can reproduce the ablation study results with the corresponding scripts, for example:
sh scripts/SegRNN/ablation/rnn_variants.sh
If you find this repo useful, please cite our paper:
@misc{lin2023segrnn,
title={SegRNN: Segment Recurrent Neural Network for Long-Term Time Series Forecasting},
author={Shengsheng Lin and Weiwei Lin and Wentai Wu and Feiyu Zhao and Ruichao Mo and Haotong Zhang},
year={2023},
eprint={2308.11200},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
We extend our heartfelt appreciation to the following GitHub repositories for providing valuable code bases and datasets:
https://github.com/yuqinie98/patchtst
https://github.com/cure-lab/LTSF-Linear
https://github.com/zhouhaoyi/Informer2020
https://github.com/thuml/Autoformer
https://github.com/MAZiqing/FEDformer
https://github.com/alipay/Pyraformer