This is a DEMO. This time-series prediction benchmark is developed under the leadership of Professor Lei CHEN. Many thanks to TSlib.
🎉 NEWS:
- 2023-10-27 Support the short-term forecasting task.
- 2023-10-16 Add several benchmark datasets.
- 2023-10-09 Support ETSformer.
- 2023-09-30 Support Autoformer.
- 2023-09-23 Support Pyraformer.
- 2023-09-22 Support Informer.
- 2023-09-15 Support Reformer.
- 2023-09-02 Support Transformer.
- 2023-08-28 Support the long-term forecasting task.
- 2023-08-27 Finished the dataloader.
```
.
├── ckpts       # Models are saved here during training.
├── data        # Benchmark datasets.
├── dataloader  # Dataloader.
├── exp         # Long-term and short-term forecasting tasks.
├── layer       # Modified from mindspore.nn.layer; contains the internal implementations of the models.
├── model       # Framework of the models.
├── results     # Results.
├── utils       # Other utility functions.
├── EDSR        # Implementation of Effective Data Selection and Replay for Unsupervised Continual Learning (EDSR).
├── exp_.py     # Main entry point.
└── README.md
```
For now, we only support the CPU version of MindSpore. We run the code with Python 3.8 on a Windows x64 platform; for other setups, refer to the official MindSpore installation guide. To install, run:
```bash
pip install https://ms-release.obs.cn-north-4.myhuaweicloud.com/2.1.1/MindSpore/cpu/x86_64/mindspore-2.1.1-cp38-cp38-win_amd64.whl --trusted-host ms-release.obs.cn-north-4.myhuaweicloud.com -i https://pypi.tuna.tsinghua.edu.cn/simple
```
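To verify the installation, MindSpore ships a built-in self-check; this snippet is our suggestion rather than part of the repository:

```python
# Sanity-check that the CPU build of MindSpore imports and runs.
import mindspore

mindspore.run_check()  # prints the installed version and whether MindSpore works
```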
Please download the data from Google Drive.
```bash
python exp.py \
  --model 'Transformer' \
  --train_epoch 10
```
Other parameters:
- `model`: choose from ['Transformer', 'Informer', 'Reformer', 'Pyraformer', 'Autoformer', 'ETSformer'].
- `patience`: patience for early stopping.
- `batch_size`: batch size.
- `learning_rate`: learning rate of the optimizer.
- `features`: forecasting task, options: [M, S, MS]; M: multivariate predicts multivariate, S: univariate predicts univariate, MS: multivariate predicts univariate.
- `enc_in`: encoder input size.
- `dec_in`: decoder input size.
- `c_out`: output size.
- `dropout`: dropout rate.
- `embed`: time feature encoding, options: [timeF, fixed, learned].
- `task_name`: task name, options: [long_term_forecast, short_term_forecast].
- `label_len`: start token length.
- `pred_len`: prediction sequence length.
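To make `label_len` and `pred_len` concrete: in Informer-style pipelines such as this one, the decoder input is typically the last `label_len` known steps of the encoder window followed by `pred_len` placeholder zeros. The sketch below is our illustration, assuming TSlib-style conventions (the variable names and the zero-padding are assumptions, not code from this repository):

```python
import numpy as np

# Assumed window sizes; these mirror common TSlib defaults, not this repo's.
seq_len, label_len, pred_len, n_vars = 96, 48, 24, 7

series = np.random.randn(seq_len + pred_len, n_vars).astype(np.float32)

enc_x = series[:seq_len]                           # encoder input window
start_token = series[seq_len - label_len:seq_len]  # last label_len known steps
horizon_pad = np.zeros((pred_len, n_vars), dtype=np.float32)
dec_x = np.concatenate([start_token, horizon_pad])  # decoder input: label_len + pred_len steps
```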
For long-term forecasting tasks:
| Dataset | args.features | args.target | args.enc_in | args.dec_in | args.c_out |
|---|---|---|---|---|---|
| ETTh | 'M' | 'OT' | 7 | 7 | 7 |
| ETTh | 'S' | 'OT' | 1 | 1 | 1 |
| ETTh | 'MS' | 'OT' | 7 | 7 | 7 |
| Exchange_rate | 'M' | 'OT' | 8 | 8 | 8 |
| Exchange_rate | 'S' | 'OT' | 1 | 1 | 1 |
| Exchange_rate | 'MS' | 'OT' | 8 | 8 | 1 |
| Electricity | 'M' | 'OT' | 321 | 321 | 321 |
| Electricity | 'S' | 'OT' | 1 | 1 | 1 |
| Electricity | 'MS' | 'OT' | 321 | 321 | 1 |
| National_illness | 'M' | 'OT' | 7 | 7 | 7 |
| National_illness | 'S' | 'OT' | 1 | 1 | 1 |
| National_illness | 'MS' | 'OT' | 7 | 7 | 1 |
| Traffic | 'M' | 'OT' | 862 | 862 | 862 |
| Traffic | 'S' | 'OT' | 1 | 1 | 1 |
| Traffic | 'MS' | 'OT' | 862 | 862 | 1 |
| Weather | 'M' | 'OT' | 21 | 21 | 21 |
| Weather | 'S' | 'OT' | 1 | 1 | 1 |
| Weather | 'MS' | 'OT' | 21 | 21 | 1 |
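To read the table: for example, the Electricity dataset in 'MS' mode corresponds to settings like the following (a hypothetical illustration of the flag values; in practice you would pass them to exp.py on the command line):

```python
# Hypothetical illustration: the Electricity / 'MS' row of the table
# expressed as the corresponding exp.py arguments.
from argparse import Namespace

args = Namespace(
    model='Informer',   # any supported model
    features='MS',      # multivariate input, univariate prediction
    target='OT',
    enc_in=321,
    dec_in=321,
    c_out=1,
)
```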
For short-term forecasting tasks, we support four models: Transformer, Autoformer, Pyraformer, and Informer. Please set args.features to 'S'.
- For Reformer, there is no CPU-based MindSpore equivalent of the PyTorch torch.einsum() function, so we keep the PyTorch version of this function in our code for its better performance (layers/reformer_attn.py). If you prefer not to use PyTorch, we also offer our own, slower implementation in the commented-out code at the same location.
- For Autoformer, ops.roll does not support CPU, so we use NumPy instead (layers/autoformer_attn.py); a minimal sketch of this kind of fallback appears after this list.
- For ETSformer,
  - since gradients are not currently supported for complex-type multiplication, we do the multiplication in ndarray format instead (layers/autoformer_attn.py, layers/etsformer_attn.py);
  - since mindspore.ops.FFTWithSize does not behave the same as torch.rfft/irfft, we use NumPy instead (layers/etsformer_attn.py); see the FFT sketch after this list.
- For now, we only provide the long-term forecasting task; short-term forecasting will be supported in the future.
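For reference, the NumPy round-trips mentioned above look roughly like the sketch below. This is our illustration, not the repository's code: the function names are hypothetical, and gradients do not flow through these host-side paths.

```python
import numpy as np
import mindspore as ms

def roll_cpu(x: ms.Tensor, shift: int, axis: int) -> ms.Tensor:
    # ops.roll has no CPU kernel, so round-trip through NumPy.
    return ms.Tensor(np.roll(x.asnumpy(), shift, axis=axis))

def rfft_cpu(x: ms.Tensor, axis: int = -1) -> np.ndarray:
    # Real-to-complex FFT with torch.rfft-like semantics via np.fft.
    return np.fft.rfft(x.asnumpy(), axis=axis)

def irfft_cpu(xf: np.ndarray, n: int, axis: int = -1) -> ms.Tensor:
    # Inverse real FFT back to a MindSpore tensor.
    return ms.Tensor(np.fft.irfft(xf, n=n, axis=axis).astype(np.float32))
```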