
Leveraging Hierarchical Feature Sharing for Efficient Dataset Condensation

Authors:
Zheng, Haizhong
Sun, Jiachen
Wu, Shutong
Kailkhura, Bhavya
Mao, Zhuoqing
Xiao, Chaowei
Prakash, Atul
Source:
ECCV 2024
Publication Year:
2023

Abstract

Given a real-world dataset, dataset condensation (DC) aims to synthesize a small synthetic dataset that captures the knowledge of the natural dataset and can be used to train models to comparable accuracy. Recent works enhance DC with data parameterization, which condenses data into compact parameterized data containers instead of images. The intuition behind data parameterization is to encode shared features of images to avoid additional storage costs. In this paper, we recognize that images share common features in a hierarchical way due to the inherent hierarchical structure of the classification system, a property overlooked by current data parameterization methods. To better align DC with this hierarchical nature and encourage more efficient information sharing inside data containers, we propose a novel data parameterization architecture, the Hierarchical Memory Network (HMN). HMN stores condensed data in a three-tier structure representing dataset-level, class-level, and instance-level features. Another helpful property of the hierarchical architecture is that HMN naturally keeps synthetic images well decoupled even while they share information, which enables instance-level pruning to remove redundant instances and further improve performance. We evaluate HMN on five public datasets and show that it outperforms all baselines.
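The three-tier structure described above can be sketched as a minimal container: each synthetic image is reconstructed by combining a shared dataset-level code, its class-level code, and its own instance-level code, and pruning an instance only removes its private code. This is an illustrative sketch, not the paper's implementation; all names, shapes, and the elementwise-sum "decoder" are assumptions.

```python
# Hypothetical sketch of a hierarchical data container (NOT the actual HMN):
# one dataset-level code shared by all images, one code per class, and one
# small code per instance. A real method would learn a decoder network;
# an elementwise sum stands in here purely for illustration.
from dataclasses import dataclass, field

@dataclass
class HierarchicalMemory:
    dataset_code: list                                  # shared by every image
    class_codes: dict = field(default_factory=dict)     # one code per class
    instance_codes: dict = field(default_factory=dict)  # one per (class, idx)

    def add_instance(self, cls, idx, code):
        self.instance_codes[(cls, idx)] = code

    def reconstruct(self, cls, idx):
        # Combine the three tiers to form one synthetic image's features.
        d = self.dataset_code
        c = self.class_codes[cls]
        i = self.instance_codes[(cls, idx)]
        return [dv + cv + iv for dv, cv, iv in zip(d, c, i)]

    def prune_instance(self, cls, idx):
        # Instance-level pruning: drop one image's private code without
        # touching the shared tiers, so the other images are unaffected.
        self.instance_codes.pop((cls, idx), None)
```

Because the shared tiers are never modified when an instance is dropped, pruning one redundant synthetic image cannot degrade the reconstruction of any other, which is the independence property the abstract refers to.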

Details

Database:
arXiv
Journal:
ECCV 2024
Publication Type:
Report
Accession Number:
edsarx.2310.07506
Document Type:
Working Paper