
[ICCV 2019] A Comprehensive Overhaul of Feature Distillation

ZJCV/overhaul


Language: 🇺🇸 🇨🇳

«overhaul» re-implements the paper A Comprehensive Overhaul of Feature Distillation

| arch_s | top1 | top5 | arch_t | top1 | top5 | dataset | lambda | top1 | top5 |
|--------|------|------|--------|------|------|---------|--------|------|------|
| MobileNetv2 | 79.420 | 95.680 | ResNet50 | 83.290 | 96.630 | CIFAR100 | 15.0 | 82.440 | 96.540 |
| ResNet18 | 80.720 | 95.840 | ResNet50 | 83.290 | 96.630 | CIFAR100 | 2.0 | 82.470 | 96.360 |
| ResNet18 | 80.720 | 95.840 | ResNet152 | 85.660 | 97.590 | CIFAR100 | 2.0 | 83.310 | 97.000 |
| ResNet50 | 83.290 | 96.630 | ResNet152 | 85.660 | 97.590 | CIFAR100 | 2.0 | 86.080 | 97.350 |
| ResNet50 | 83.290 | 96.630 | ResNeXt_32x8d | 85.600 | 97.460 | CIFAR100 | 2.0 | 85.410 | 97.430 |

See the docs for more results.
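The `lambda` column above weights the distillation term in the overall training objective. A minimal sketch of how such a combined loss is typically formed (the function name here is illustrative, not taken from this repo):

```python
def total_loss(ce_loss: float, distill_loss: float, lam: float) -> float:
    # Combined objective: task cross-entropy on the student's predictions,
    # plus the feature-distillation term scaled by lambda
    # (the "lambda" column in the results table above).
    return ce_loss + lam * distill_loss
```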


Background

By choosing a new distillation position and designing a new teacher transform and distance function, OFD (Overhaul of Feature Distillation) achieves stronger distillation results.

Current project implementation is based on clovaai/overhaul-distillation.
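The paper's two key components, the margin-ReLU teacher transform and the partial-L2 distance, can be sketched as follows. This is a NumPy illustration following the definitions in the paper, not this repo's actual (PyTorch) implementation:

```python
import numpy as np

def margin_relu(teacher_feat: np.ndarray, margin: float) -> np.ndarray:
    # Teacher transform from the paper: positive activations pass through,
    # negative ones are clamped to a (negative) margin value m, so the
    # student is not forced to match uninformative negative responses exactly.
    return np.where(teacher_feat > 0, teacher_feat, margin)

def partial_l2(teacher: np.ndarray, student: np.ndarray) -> float:
    # Distance function from the paper: the squared error is skipped at
    # positions where the transformed teacher value is negative and the
    # student is already below it (S_i <= T_i <= 0); elsewhere it is the
    # ordinary squared difference.
    diff = (teacher - student) ** 2
    skip = (student <= teacher) & (teacher <= 0)
    return float(np.sum(diff * ~skip))
```

For example, with `teacher = [1.0, -0.5]` and `student = [0.5, -1.0]`, only the first position contributes, since the student is already more negative than the (negative) teacher target at the second position.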

Installation

$ pip install -r requirements.txt

Usage

  • Train
$ CUDA_VISIBLE_DEVICES=0 python tools/train.py -cfg=configs/resnet/ofd_2_0_r50_pret_r18_c100_224_e100_sgd_mslr.yaml
  • Test
$ CUDA_VISIBLE_DEVICES=0 python tools/test.py -cfg=configs/resnet/ofd_2_0_r50_pret_r18_c100_224_e100_sgd_mslr.yaml

Maintainers

  • zhujian - Initial work - zjykzj

Thanks

@inproceedings{heo2019overhaul,
  title={A Comprehensive Overhaul of Feature Distillation},
  author={Heo, Byeongho and Kim, Jeesoo and Yun, Sangdoo and Park, Hyojin and Kwak, Nojun and Choi, Jin Young},
  booktitle = {International Conference on Computer Vision (ICCV)},
  year={2019}
}

Contributing

Anyone's participation is welcome! Open an issue or submit PRs.

Small note: If editing the README, please conform to the standard-readme specification.

License

Apache License 2.0 © 2021 zjykzj