
An experimental comparison of knowledge transfer algorithms in deep neural networks

Authors:
Quinn, Sean
McGuinness, Kevin
Mileo, Alessandra
Publication Year:
2021

Abstract

Neural knowledge transfer methods aim to constrain the hidden representation of one neural network to be similar, or to have similar properties, to that of another by applying specially designed loss functions between the two networks' hidden layers. In this way the intangible knowledge encoded in a network's weights is transferred without having to replicate exact weight structures or alter the knowledge representation from its natural, highly distributed form. Motivated by the need to enable greater transparency in evaluating such methods by bridging the gap between the different experimental setups in the existing literature, by the need to compare each method against a greater number of its peers, and by a desire to explore novel combinations of existing methods, we conduct an experimental comparison of eight contemporary neural knowledge transfer algorithms and further explore the performance of some combinations. We conduct our experiments on an image classification task and measure relative performance gains over non-knowledge-enhanced baseline neural networks in terms of classification accuracy. We observe (i) some interesting contradictions between our results and those reported in the original papers, (ii) a general lack of correlation between a given method's standalone performance and its performance when used in combination with knowledge distillation, (iii) a general trend of older, simpler methods outperforming newer ones, and (iv) Contrastive Representation Distillation (CRD) achieving the best performance.
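
For readers unfamiliar with the mechanics the abstract describes, the following PyTorch sketch illustrates the general recipe: a loss term applied between the two networks' hidden layers (here a FitNets-style hint loss, one of the older, simpler methods in this family) combined with the classic logit-level knowledge distillation term that the combinations build on. This is an illustrative sketch, not code from the paper; the function names, the 1x1 adapter, the temperature T, and the weights alpha and beta are all hypothetical choices.

    import torch.nn as nn
    import torch.nn.functional as F

    def feature_transfer_loss(student_feat, teacher_feat, adapter):
        # FitNets-style "hint" loss: an L2 match between hidden activations.
        # The adapter (e.g. a 1x1 conv) maps student channels to teacher
        # channels; the teacher is detached so gradients flow only into
        # the student.
        return F.mse_loss(adapter(student_feat), teacher_feat.detach())

    def distillation_loss(student_logits, teacher_logits, T=4.0):
        # Hinton-style knowledge distillation on temperature-softened logits.
        p_teacher = F.softmax(teacher_logits.detach() / T, dim=1)
        log_p_student = F.log_softmax(student_logits / T, dim=1)
        return F.kl_div(log_p_student, p_teacher,
                        reduction="batchmean") * T * T

    def total_loss(logits_s, logits_t, feat_s, feat_t, labels, adapter,
                   alpha=0.5, beta=1.0):
        # Overall training objective: the ordinary classification loss plus
        # weighted transfer terms. alpha and beta are illustrative
        # hyperparameters, tuned per method in practice.
        return (F.cross_entropy(logits_s, labels)
                + alpha * distillation_loss(logits_s, logits_t)
                + beta * feature_transfer_loss(feat_s, feat_t, adapter))

    # Example adapter for convolutional features (channel counts assumed):
    # adapter = nn.Conv2d(in_channels=64, out_channels=128, kernel_size=1)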

Details

Database:
OAIster
Notes:
application/pdf, English
Publication Type:
Electronic Resource
Accession number:
edsoai.on1390667808
Document Type:
Electronic Resource