On Symmetric Losses for Learning from Corrupted Labels [ICML'19]

Code for the paper "On Symmetric Losses for Learning from Corrupted Labels".
Authors: Nontawat Charoenphakdee, Jongyeong Lee, Masashi Sugiyama

Implementation for experiments with CIFAR-10 and MNIST

Paper link: arXiv

Usage

  • Install dependencies: Python 3.5+, PyTorch 1.0, numpy, PIL, sklearn.
  • Run `python main.py --data cifar-10 --epoch 50 --prior 0.65 --opt auc` (these are the default settings).
  • --data : MNIST (Odd vs. Even) or CIFAR-10 (Airplane vs. Horse)
  • --prior : one of (1.0, 0.0), (0.8, 0.3), (0.7, 0.4), (0.65, 0.45)
  • --opt : BER (balanced error rate minimization) or AUC (area under the receiver operating characteristic curve maximization); see the illustrative sketch after this list.
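Both objectives build on a symmetric loss, i.e., a margin loss ℓ satisfying ℓ(z) + ℓ(−z) = constant. The snippet below is a minimal illustrative sketch, not the code in this repository: it assumes hypothetical score tensors `scores_pos` and `scores_neg` produced by any PyTorch model, and uses the sigmoid loss as the symmetric surrogate to show roughly how BER-minimization and AUC-maximization objectives can be written.

```python
# Illustrative sketch only (hypothetical names, not this repository's implementation).
import torch

def sigmoid_loss(z):
    # Sigmoid loss l(z) = 1 / (1 + exp(z)); symmetric because l(z) + l(-z) = 1.
    return torch.sigmoid(-z)

def ber_objective(scores_pos, scores_neg):
    # Balanced error rate surrogate: average the class-conditional risks of the
    # (possibly corrupted) positive and negative samples with equal weight.
    return 0.5 * (sigmoid_loss(scores_pos).mean() + sigmoid_loss(-scores_neg).mean())

def auc_objective(scores_pos, scores_neg):
    # AUC surrogate: penalize every positive-negative pair whose scores are not
    # ranked correctly (a positive sample should score higher than a negative one).
    margins = scores_pos.unsqueeze(1) - scores_neg.unsqueeze(0)
    return sigmoid_loss(margins).mean()

# Hypothetical usage with scores g(x) from any model:
scores_pos = torch.randn(8, requires_grad=True)  # scores of samples labeled positive
scores_neg = torch.randn(8, requires_grad=True)  # scores of samples labeled negative
loss = ber_objective(scores_pos, scores_neg)     # or auc_objective(scores_pos, scores_neg)
loss.backward()
```

The key property exploited here is the symmetry condition; any loss satisfying it (e.g., the sigmoid or ramp loss) could be substituted for the sigmoid loss in the sketch.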

Reference

[1] Nontawat Charoenphakdee, Jongyeong Lee, and Masashi Sugiyama. "On Symmetric Losses for Learning from Corrupted Labels." In Proceedings of the 36th International Conference on Machine Learning (ICML 2019), Proceedings of Machine Learning Research, Long Beach, California, USA, June 9-15, 2019.
