
Efficient Dataset Distillation via Minimax Diffusion

Authors:
Gu, Jianyang
Vahidian, Saeed
Kungurtsev, Vyacheslav
Wang, Haonan
Jiang, Wei
You, Yang
Chen, Yiran
Publication Year:
2023

Abstract

Dataset distillation reduces the storage and computational cost of training a network by generating a small surrogate dataset that encapsulates the rich information of the original large-scale one. However, previous distillation methods rely heavily on a sample-wise iterative optimization scheme: as the images-per-class (IPC) setting or the image resolution grows, the necessary computation demands overwhelming time and resources. In this work, we incorporate generative diffusion techniques to compute the surrogate dataset. Observing that the key factors for constructing an effective surrogate dataset are representativeness and diversity, we design additional minimax criteria in the generative training to enhance these facets of the images generated by diffusion models. We present a theoretical model of the process as hierarchical diffusion control, demonstrating the flexibility of the diffusion process to target these criteria without jeopardizing the faithfulness of the samples to the desired distribution. The proposed method achieves state-of-the-art validation performance while demanding far fewer computational resources. Under the 100-IPC setting on ImageWoof, our method requires less than one-twentieth the distillation time of previous methods, yet yields even better performance. Source code and generated data are available at https://github.com/vimar-gu/MinimaxDiffusion.

Comment: CVPR 2024
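The minimax criteria described in the abstract can be made concrete with a small sketch. The following PyTorch fragment is illustrative rather than the authors' released code: it assumes per-class features have already been extracted by some encoder, uses cosine similarity, and the function name minimax_losses is hypothetical. Representativeness raises the worst-case coverage of real samples by generated ones; diversity lowers the similarity of the most redundant generated pair.

    import torch
    import torch.nn.functional as F

    def minimax_losses(gen_feats, real_feats):
        # gen_feats:  (G, D) features of generated images for one class
        # real_feats: (R, D) features of real images from the same class
        gen = F.normalize(gen_feats, dim=1)
        real = F.normalize(real_feats, dim=1)

        # Representativeness (minimax): for each real image, take its best
        # generated match, then push up the worst-covered real image.
        sim_gr = gen @ real.t()                     # (G, R) cosine similarities
        repr_loss = -sim_gr.max(dim=0).values.min()

        # Diversity (minimax): find the most similar pair of generated
        # images and push their similarity down.
        sim_gg = gen @ gen.t()
        sim_gg.fill_diagonal_(-1.0)                 # ignore self-similarity
        div_loss = sim_gg.max()

        return repr_loss, div_loss

In training, such terms would be added to the standard diffusion denoising objective with small weights, e.g. loss = diffusion_loss + lambda_r * repr_loss + lambda_d * div_loss, where lambda_r and lambda_d are assumed hyperparameters.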

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2311.15529
Document Type:
Working Paper