Awesome Knowledge Distillation
Updated Oct 14, 2024
Awesome Knowledge-Distillation: knowledge distillation papers (2014-2021), organized by category.
[ECCV2022] Factorizing Knowledge in Neural Networks
Multiple Generation Based Knowledge Distillation: A Roadmap
Official implementation of "Learn From One Specialized Sub-Teacher: One-to-One Mapping for Feature-Based Knowledge Distillation" accepted in EMNLP-Findings 2023
Code reproduction of the paper "Distillation Decision Tree"
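The repositories above build on the classic soft-label distillation objective (Hinton et al., 2015): a student network is trained to match the teacher's temperature-softened output distribution. As a hedged, framework-free sketch (plain NumPy; function names and the temperature value are illustrative, not from any repo listed above):

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax over the last axis (numerically stable).
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL(teacher || student) between temperature-softened distributions,
    # scaled by T^2 so gradients stay comparable across temperatures.
    p = softmax(teacher_logits, T)   # soft targets from the teacher
    q = softmax(student_logits, T)   # student predictions
    kl = np.sum(p * (np.log(p) - np.log(q)), axis=-1)
    return (T ** 2) * kl.mean()
```

In practice this term is combined with the ordinary cross-entropy on ground-truth labels, weighted by a mixing coefficient; feature-based methods (such as the one-to-one mapping work above) instead match intermediate representations rather than output logits.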