A large-scale study of Knowledge Distillation.
[CVPR 2021] Distilling Knowledge via Knowledge Review
[ICCV 2019] A Comprehensive Overhaul of Feature Distillation
Official implementation of "Intra-Class Similarity-Guided Feature Distillation" accepted in NeurIPS-ENLSP 2023
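The repositories above implement variants of feature distillation, where a student network is trained to mimic a teacher's intermediate representations rather than only its output logits. A minimal sketch of the core loss, using illustrative names and plain Python (real implementations add projection layers when dimensions differ and combine this term with a task loss):

```python
# Minimal feature-distillation loss sketch: mean squared error between a
# student's and a teacher's intermediate feature vectors. Function and
# variable names here are illustrative, not from any of the listed repos.

def feature_distillation_loss(student_feats, teacher_feats):
    """MSE between equal-length student and teacher feature vectors."""
    if len(student_feats) != len(teacher_feats):
        # Real pipelines insert a learned projection to align dimensions.
        raise ValueError("feature dimensions must match")
    return sum((s - t) ** 2 for s, t in zip(student_feats, teacher_feats)) / len(student_feats)

# Identical features give zero loss; mismatched features are penalized.
zero = feature_distillation_loss([0.5, 1.0, -0.25], [0.5, 1.0, -0.25])
```

In practice this term is weighted and added to the usual cross-entropy (and often a logit-distillation) loss during student training.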