
Towards Effective Data-Free Knowledge Distillation via Diverse Diffusion Augmentation

Authors:
Li, Muquan
Zhang, Dongyang
He, Tao
Xie, Xiurui
Li, Yuan-Fang
Qin, Ke
Publication Year:
2024

Abstract

Data-free knowledge distillation (DFKD) has emerged as a pivotal technique for model compression, substantially reducing dependency on the original training data. Nonetheless, conventional DFKD methods that rely on synthesized training data suffer from inadequate diversity and distributional discrepancies between the synthesized and original datasets. To address these challenges, this paper introduces diverse diffusion augmentation (DDA), a novel approach to DFKD. Specifically, we revise the common data-synthesis paradigm of DFKD into a composite process that applies diffusion models after data synthesis for self-supervised augmentation, generating a spectrum of data samples with similar distributions while retaining controlled variations. Furthermore, to mitigate excessive deviation in the embedding space, we introduce an image-filtering technique grounded in cosine similarity to maintain fidelity during knowledge distillation. Comprehensive experiments on the CIFAR-10, CIFAR-100, and Tiny-ImageNet datasets demonstrate the superior performance of our method across various teacher-student network configurations, outperforming contemporary state-of-the-art DFKD methods. Code will be available at: https://github.com/SLGSP/DDA.
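To make the cosine-similarity filtering step described in the abstract concrete, the following is a minimal PyTorch sketch, not the authors' released code: the frozen encoder, the function name, and the 0.8 threshold are illustrative assumptions, with augmented samples kept only if their embeddings stay close to those of the synthetic images they were derived from.

import torch
import torch.nn.functional as F

def filter_augmented_batch(encoder, synthetic_imgs, augmented_imgs, threshold=0.8):
    # Embed both the synthesized images and their diffusion-augmented versions
    # with a frozen feature extractor (e.g., the teacher backbone), then keep
    # only augmented samples whose cosine similarity to their source embedding
    # exceeds the threshold, limiting deviation in the embedding space.
    with torch.no_grad():
        z_syn = F.normalize(encoder(synthetic_imgs), dim=1)  # (N, D)
        z_aug = F.normalize(encoder(augmented_imgs), dim=1)  # (N, D)
    cos_sim = (z_syn * z_aug).sum(dim=1)                     # pairwise cosine similarity, (N,)
    keep = cos_sim >= threshold                              # boolean mask of retained samples
    return augmented_imgs[keep], keep

# Hypothetical usage: x_aug are diffusion-augmented copies of synthesized x_syn,
# and the retained subset is passed on to the knowledge-distillation loss.
# kept_imgs, mask = filter_augmented_batch(teacher_backbone, x_syn, x_aug, threshold=0.8)

The key design choice this sketch illustrates is that filtering happens in the embedding space rather than pixel space, so diffusion augmentation can vary appearance freely while samples that drift too far semantically are discarded before distillation.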

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2410.17606
Document Type:
Working Paper
Full Text:
https://doi.org/10.1145/3664647.3680711