Multi-teacher Contrastive Knowledge Inversion for Data-Free Distillation
- Authors
LIN Zhenyuan, LIN Shaohui, YAO Yiwu, HE Gaoqi, WANG Changbo, MA Lizhuang
- Subjects
model compression, data-free, knowledge distillation, data protection, privacy protection, Electronic computers. Computer science, QA75.5-76.95
- Abstract
Knowledge distillation is an effective model-compression method when the training data are accessible. However, due to privacy, confidentiality, or transmission limitations, such data are often unavailable. Existing data-free knowledge distillation methods rely only on the biased feature statistics contained in a single model and suffer from poor generalizability, low diversity in the synthesized images, and unsatisfactory student-model performance. To address these problems, this paper proposes a multi-teacher contrastive knowledge inversion (MTCKI) method that extracts and fuses model-specific knowledge from the available teacher models into a student model to eliminate model bias. Further, this paper improves the diversity of the synthesized images through contrastive learning, which encourages each synthetic image to be distinguishable from previously stored images. Meanwhile, this paper proposes a contrastive-loss strategy over the multiple teachers and the student to improve the feature representation ability of the student network. Experiments demonstrate that MTCKI not only generates visually satisfactory images but also outperforms existing state-of-the-art approaches. The resulting synthesized images are much closer to the distribution of the original dataset and need to be generated only once to provide comprehensive guidance for various networks rather than a specific one.
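The two mechanisms named in the abstract, a contrastive diversity term on the synthesized images and the fusion of several teachers' knowledge into one student, could look roughly like the sketch below. This is a minimal illustrative sketch in PyTorch, assuming hypothetical feature and logit shapes, an in-memory bank of previously synthesized images, and simple averaging of the teachers' soft targets; it is not the authors' released MTCKI implementation.

```python
# Illustrative sketch only (not the paper's code): a contrastive diversity term
# for data-free image synthesis plus multi-teacher logit averaging for distillation.
# Feature dimensions, temperatures, and the memory bank are hypothetical choices.
import torch
import torch.nn.functional as F

def diversity_loss(new_feats, bank_feats, temperature=0.1):
    """Push features of newly synthesized images away from features of
    previously stored synthetic images (InfoNCE-style repulsion)."""
    new_feats = F.normalize(new_feats, dim=1)
    bank_feats = F.normalize(bank_feats, dim=1)
    # Similarity of each new image to every stored image; minimizing the
    # log-sum-exp of these similarities encourages dissimilar (diverse) samples.
    sim = new_feats @ bank_feats.t() / temperature          # (B, M)
    return torch.logsumexp(sim, dim=1).mean()

def multi_teacher_distill_loss(student_logits, teacher_logits_list, T=4.0):
    """KL divergence between the student and the averaged soft targets of
    several teachers (one simple way to fuse multi-teacher knowledge)."""
    teacher_probs = torch.stack(
        [F.softmax(t / T, dim=1) for t in teacher_logits_list]).mean(0)
    log_student = F.log_softmax(student_logits / T, dim=1)
    return F.kl_div(log_student, teacher_probs, reduction="batchmean") * T * T

if __name__ == "__main__":
    # Toy shapes: batch of 8 synthetic images, bank of 64 stored images, 10 classes.
    feats_new, feats_bank = torch.randn(8, 128), torch.randn(64, 128)
    s_logits = torch.randn(8, 10)
    t_logits = [torch.randn(8, 10) for _ in range(3)]       # three teachers
    loss = diversity_loss(feats_new, feats_bank) \
         + multi_teacher_distill_loss(s_logits, t_logits)
    print(float(loss))
```

In a full data-free pipeline, the diversity term would be added to the image-synthesis objective (alongside terms that keep the images consistent with the teachers), while the distillation term trains the student on the synthesized batches.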
- Published
- 2023