
Follow Your Path: A Progressive Method for Knowledge Distillation

Authors :
Lei Li
Yuxuan Song
Bohan Li
Wenxian Shi
Hao Zhou
Source :
Machine Learning and Knowledge Discovery in Databases. Research Track ISBN: 9783030865221, ECML/PKDD (3)
Publication Year :
2021
Publisher :
Springer International Publishing, 2021.

Abstract

Deep neural networks often have a huge number of parameters, which poses challenges for deployment in application scenarios with limited memory and computation capacity. Knowledge distillation is one approach to deriving compact models from larger ones. However, it has been observed that a converged heavy teacher model imposes an overly strong constraint on learning a compact student network and can drive the optimization toward poor local optima. In this paper, we propose ProKT, a new model-agnostic method that projects the supervision signals of a teacher model into the student's parameter space. This projection is implemented by decomposing the training objective into local intermediate targets with an approximate mirror descent technique. The proposed method is less sensitive to quirks during optimization, which can result in better local optima. Experiments on both image and text datasets show that our proposed ProKT consistently achieves superior performance compared to other existing knowledge distillation methods.
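The core idea, as stated in the abstract, is that the student follows the teacher's optimization path via intermediate targets rather than matching only the converged teacher. The sketch below is a minimal, hypothetical illustration of that idea, not the authors' ProKT implementation: it uses plain temperature-scaled softmax and KL divergence, and the teacher checkpoints and logits are made-up values.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; higher temperature yields softer targets.
    scaled = [z / temperature for z in logits]
    m = max(scaled)
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kl_divergence(p, q, eps=1e-12):
    # KL(p || q) between two discrete distributions (standard distillation loss).
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def progressive_target(teacher_path, step, total_steps):
    # Pick the teacher checkpoint matching the student's training progress,
    # so the student chases a nearby intermediate target instead of the
    # fully converged teacher from the very start.
    idx = min(len(teacher_path) - 1, step * len(teacher_path) // total_steps)
    return teacher_path[idx]

# Hypothetical teacher logits saved at three points along its training path.
teacher_path = [
    [1.0, 0.8, 0.6],   # early, nearly uniform
    [2.0, 0.5, 0.2],   # mid-training
    [4.0, 0.1, -1.0],  # converged, confident
]
student_logits = [1.2, 0.9, 0.5]

for step in (0, 5, 9):
    target = progressive_target(teacher_path, step, total_steps=10)
    loss = kl_divergence(softmax(target, 2.0), softmax(student_logits, 2.0))
    print(f"step {step}: distillation loss = {loss:.4f}")
```

Note how the loss against the early checkpoint is much smaller than against the converged teacher: the intermediate targets keep each local objective close to the student's current state, which is the intuition behind decomposing the training objective along the teacher's path.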

Details

ISBN :
978-3-030-86522-1
Database :
OpenAIRE
Accession number :
edsair.doi...........3b1a153c93ac2496bff4c8ae6ecc7d4c