sithu31296/Knowledge-Distillation-Pipeline

FastKD

Introduction

PyTorch Knowledge Distillation Framework.

Features

Datasets:

Sample Model:

  • Teacher: ResNet50 (from torchvision)
  • Student: ResNet18 (from torchvision)

KD Methods:
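The list of implemented methods is not shown here, but the classic baseline in most KD pipelines is Hinton-style soft-target distillation: the student matches the teacher's temperature-softened class distribution via a KL divergence scaled by T². A dependency-free sketch of that loss (function and parameter names are illustrative, not this repo's API):

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T produces softer distributions.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, T=4.0):
    """KL(teacher || student) on temperature-softened distributions,
    scaled by T^2 so gradients keep comparable magnitude across T."""
    p = softmax(teacher_logits, T)  # soft teacher targets
    q = softmax(student_logits, T)  # soft student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return (T ** 2) * kl
```

In practice this term is usually combined with the ordinary cross-entropy on the ground-truth labels, weighted by a mixing coefficient.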

Features coming soon:

Methods Comparison

Coming Soon...

Configuration

Create a configuration file in configs. A sample configuration for the ImageNet dataset can be found here. Then edit the fields as needed. This configuration file is required by both the training and evaluation scripts.
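The exact schema depends on the sample file in configs, but a YAML config for a pipeline like this typically looks something like the following (all field names here are illustrative assumptions, not this repo's actual keys):

```yaml
# Hypothetical config sketch -- check configs/ for the real field names.
dataset:
  name: imagenet
  root: /path/to/imagenet
model:
  teacher: resnet50
  student: resnet18
train:
  epochs: 100
  batch_size: 256
  lr: 0.1
kd:
  temperature: 4.0
  alpha: 0.5        # weight between KD loss and cross-entropy
```

Whatever the real schema is, the same file is passed via `--cfg` to both train.py and val.py below.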

Training

$ python train.py --cfg configs/CONFIG_FILE_NAME.yaml

Evaluation

$ python val.py --cfg configs/CONFIG_FILE_NAME.yaml
