Densely Distilled Flow-Based Knowledge Transfer in Teacher-Student Framework for Image Classification.
- Source :
- IEEE Transactions on Image Processing; 2020, Vol. 29, p5698-5710, 13p
- Publication Year :
- 2020
Abstract
- We propose a new teacher–student framework (TSF)-based knowledge transfer method, in which knowledge in the form of dense flow across layers is distilled from a pre-trained “teacher” deep neural network (DNN) and transferred to another “student” DNN. In the case of distilled knowledge, multiple overlapped flow-based items of information from the pre-trained teacher DNN are densely extracted across layers. Transference of the densely extracted teacher information is then achieved in the TSF using repetitive sequential training from bottom to top between the teacher and student DNN models. In other words, to efficiently transmit extracted useful teacher information to the student DNN, we perform bottom-up step-by-step transfer of densely distilled knowledge. The performance of the proposed method in terms of image classification accuracy and fast optimization is compared with those of existing TSF-based knowledge transfer methods for application to reliable image datasets, including CIFAR-10, CIFAR-100, MNIST, and SVHN. When the dense flow-based sequential knowledge transfer scheme is employed in the TSF, the trained student ResNet more accurately reflects the rich information of the pre-trained teacher ResNet and exhibits superior accuracy to the existing TSF-based knowledge transfer methods for all benchmark datasets considered in this study.
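- The abstract does not define the flow representation in detail; assuming it follows the common FSP-matrix formulation of inter-layer flow (a Gram matrix between two layers' feature maps), a minimal NumPy sketch of the "dense" variant, where flow matrices are extracted for all overlapping layer pairs and matched between teacher and student, might look like the following. All function names here (`fsp_matrix`, `dense_flow_loss`) are illustrative, not from the paper, and the sketch assumes teacher and student features share the same shapes at the compared layers.

```python
import numpy as np

def fsp_matrix(f_in, f_out):
    """Flow (Gram) matrix between two feature maps.

    f_in: (C1, H, W), f_out: (C2, H, W) with matching spatial size.
    Returns a (C1, C2) matrix averaged over spatial positions.
    """
    c1, h, w = f_in.shape
    c2 = f_out.shape[0]
    a = f_in.reshape(c1, h * w)
    b = f_out.reshape(c2, h * w)
    return (a @ b.T) / (h * w)

def dense_flow_loss(teacher_feats, student_feats):
    """Sum of squared L2 distances between the flow matrices of ALL
    overlapping layer pairs (i, j), i < j -- the "dense" extraction,
    as opposed to matching only adjacent layer pairs.

    teacher_feats, student_feats: lists of (C, H, W) arrays, one per
    selected layer, assumed shape-matched between the two networks.
    """
    loss = 0.0
    n = len(teacher_feats)
    for i in range(n):
        for j in range(i + 1, n):
            g_t = fsp_matrix(teacher_feats[i], teacher_feats[j])
            g_s = fsp_matrix(student_feats[i], student_feats[j])
            loss += np.sum((g_t - g_s) ** 2)
    return loss
```

In a training loop, the bottom-up sequential scheme described in the abstract would minimize this loss stage by stage (lowest layer pairs first) before, or alongside, the task loss; that scheduling is omitted here for brevity.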
- Subjects :
- IMAGE recognition (Computer vision)
- KNOWLEDGE transfer
Details
- Language :
- English
- ISSN :
- 10577149
- Volume :
- 29
- Database :
- Complementary Index
- Journal :
- IEEE Transactions on Image Processing
- Publication Type :
- Academic Journal
- Accession number :
- 170078356
- Full Text :
- https://doi.org/10.1109/TIP.2020.2984362