
This is the official PyTorch implementation of the paper "Scale-teaching: Robust Multi-scale Training for Time Series Classification with Noisy Labels" (NeurIPS 2023).

qianlima-lab/Scale-teaching


Scale-teaching: Robust Multi-scale Training for Time Series Classification with Noisy Labels

This is the training code for our paper "Scale-teaching: Robust Multi-scale Training for Time Series Classification with Noisy Labels" (NeurIPS-23).

Abstract

Deep Neural Networks (DNNs) have been criticized because they easily overfit noisy (incorrect) labels. To improve the robustness of DNNs, existing methods for image data regard samples with small training losses as correctly labeled data (small-loss criterion). Nevertheless, the discriminative patterns of time series are easily distorted by external noises (e.g., frequency perturbations) during the recording process. As a result, the training losses of some time series samples do not meet the small-loss criterion. Therefore, this paper proposes a deep learning paradigm called Scale-teaching for combating noisy labels in time series. Specifically, we design a fine-to-coarse cross-scale fusion mechanism for learning discriminative patterns by utilizing time series at different scales to train multiple DNNs simultaneously. Meanwhile, each network is trained in a cross-teaching manner by using complementary information from different scales to select small-loss samples as clean labels. For unselected large-loss samples, we introduce multi-scale embedding graph learning via label propagation to correct their labels by using selected clean samples. Experiments on multiple benchmark time series datasets demonstrate the superiority of the proposed Scale-teaching paradigm over state-of-the-art methods in terms of effectiveness and robustness.
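The two core ideas in the abstract can be sketched in a few lines: building fine-to-coarse views of a series by successive downsampling, and co-teaching-style small-loss selection, where each network keeps the samples its peer judges clean. This is an illustrative sketch only, not code from this repository; the function names (`make_scales`, `cross_teaching_select`), the average-pooling downsampler, and the fixed keep ratio are all simplifying assumptions, and NumPy is used here for portability even though the repository itself is PyTorch.

```python
import numpy as np

def make_scales(series, num_scales=3):
    """Illustrative fine-to-coarse views of a 1-D time series,
    halving the length at each scale via average pooling (window 2).
    The actual multi-scale construction in the paper may differ."""
    scales = [np.asarray(series, dtype=float)]
    for _ in range(num_scales - 1):
        prev = scales[-1]
        trimmed = prev[: len(prev) // 2 * 2]  # drop a trailing odd sample
        scales.append(trimmed.reshape(-1, 2).mean(axis=1))
    return scales

def cross_teaching_select(loss_a, loss_b, forget_rate):
    """Co-teaching-style selection: each network trains on the samples
    its *peer* considers small-loss, so labeling errors made by one
    network are not directly reinforced by its own selections."""
    n_keep = int(round((1.0 - forget_rate) * len(loss_a)))
    keep_for_a = np.argsort(loss_b)[:n_keep]  # peer B selects for A
    keep_for_b = np.argsort(loss_a)[:n_keep]  # peer A selects for B
    return keep_for_a, keep_for_b
```

The cross selection is the standard co-teaching trick: because the two networks are trained on different scales, they make different mistakes, and exchanging small-loss picks keeps either network from overfitting its own noisy selections.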

Datasets

Four individual large time series datasets

UCR 128 archive time series datasets

UEA 30 archive time series datasets

Usage (Our Model)

To train a Scale-teaching model on a dataset, run

python scale_teaching.py --dataset [name of the dataset you want to train]  ...

Citation

If you use this code for your research, please cite our paper:

@inproceedings{liu2023scaleteaching,
  title={Scale-teaching: Robust Multi-scale Training for Time Series Classification with Noisy Labels},
  author={Zhen Liu and Peitian Ma and Dongliang Chen and Wenbin Pei and Qianli Ma},
  booktitle={Thirty-seventh Conference on Neural Information Processing Systems},
  year={2023},
  url={https://openreview.net/forum?id=9D0fELXbrg}
}
