Request for Knowledge Distillation example #226
-
Hi @mountains-high, just a disclaimer: I am not very familiar with this paper, and you are probably more familiar with it than I am. To help you reimplement their approach with torchdistill, I need to understand which modules in the models are trained and how (inputs/outputs, loss functions, etc.).
-
Hello, Thank you for the great work.
I'd like to ask for your assistance in implementing some of the experiments from https://arxiv.org/abs/1912.08795: knowledge distillation on CIFAR-10 with ResNet-34 or VGG-11 as the teacher network and ResNet-18 or VGG-11 as the student network (Table 1 and Figure 4). Could you show one example, please?
Thank you very much for your time and consideration.
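For what it's worth, the distillation step in these experiments presumably uses the standard temperature-scaled KD objective (Hinton et al., 2015) between teacher and student logits; that is an assumption on my part, not something confirmed here. A minimal pure-Python sketch of that loss (the temperature value 4.0 is an arbitrary illustrative choice):

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax over a list of logits.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, temperature=4.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 so gradients keep a comparable magnitude.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return (temperature ** 2) * kl
```

When the student matches the teacher exactly, the loss is zero; any mismatch gives a positive value. In practice this term is usually mixed with the ordinary cross-entropy on ground-truth labels via a weighting coefficient.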