Big2Small: Learning from masked image modelling with heterogeneous self‐supervised knowledge distillation
- Source :
- IET Cyber-systems and Robotics, Vol 6, Iss 4 (2024)
- Publication Year :
- 2024
- Publisher :
- Wiley, 2024.
-
Abstract
- Small convolutional neural network (CNN)‐based models usually require transferring knowledge from a large model before they are deployed on computationally resource‐limited edge devices. Masked image modelling (MIM) methods have achieved great success in various visual tasks but remain largely unexplored in knowledge distillation for heterogeneous deep models, mainly because of the significant discrepancy between transformer‐based large models and CNN‐based small networks. In this paper, the authors develop the first heterogeneous self‐supervised knowledge distillation (HSKD) method based on MIM, which can efficiently transfer knowledge from large transformer models to small CNN‐based models in a self‐supervised fashion. The method builds a bridge between transformer‐based models and CNNs by training a UNet‐style student with sparse convolution, which can effectively mimic the visual representations inferred by a teacher under masked modelling. It is a simple yet effective learning paradigm for learning the visual representation and data distribution from heterogeneous teacher models, which can be pre‐trained using advanced self‐supervised methods. Extensive experiments show that it adapts well to various models and sizes, consistently achieving state‐of‐the‐art performance in image classification, object detection, and semantic segmentation tasks. For example, on the ImageNet‐1K dataset, HSKD improves the accuracy of ResNet‐50 (sparse) from 76.98% to 80.01%.
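- The core idea the abstract describes, a CNN student trained to mimic a transformer teacher's features on masked image patches, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the random arrays stand in for a frozen ViT teacher's patch features and the sparse‐convolution student's predictions, and the mask ratio, feature dimension, and loss choice (MSE on masked patches) are assumptions for illustration.

```python
import numpy as np

def random_patch_mask(num_patches, mask_ratio, rng):
    """Boolean mask over image patches: True = patch hidden from the student."""
    n_mask = int(num_patches * mask_ratio)
    mask = np.zeros(num_patches, dtype=bool)
    mask[rng.choice(num_patches, size=n_mask, replace=False)] = True
    return mask

def distillation_loss(teacher_feats, student_feats, mask):
    """MSE between teacher and student patch features, restricted to masked patches."""
    diff = teacher_feats[mask] - student_feats[mask]
    return float(np.mean(diff ** 2))

rng = np.random.default_rng(0)

# Toy setup: 196 patches (a 14x14 grid), 256-d features, 60% of patches masked.
# All sizes are illustrative, not taken from the paper.
num_patches, dim = 196, 256
mask = random_patch_mask(num_patches, 0.6, rng)

# Stand-ins: frozen transformer teacher features, and a student prediction
# that approximates them (as the distilled sparse-conv student would).
teacher = rng.normal(size=(num_patches, dim))
student = teacher + 0.1 * rng.normal(size=(num_patches, dim))

loss = distillation_loss(teacher, student, mask)
```

In this sketch the student only pays a penalty on the masked patches, which is what lets a masked‐modelling objective drive the distillation: the student must infer the teacher's representation of content it cannot directly see.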
Details
- Language :
- English
- ISSN :
- 26316315
- Volume :
- 6
- Issue :
- 4
- Database :
- Directory of Open Access Journals
- Journal :
- IET Cyber-systems and Robotics
- Publication Type :
- Academic Journal
- Accession number :
- edsdoj.91a81d2500354d8ab6cca95ff3bd8f68
- Document Type :
- article
- Full Text :
- https://doi.org/10.1049/csy2.70002