
Simplifying Neural Network Training Under Class Imbalance

Authors:
Shwartz-Ziv, Ravid
Goldblum, Micah
Li, Yucen Lily
Bruss, C. Bayan
Wilson, Andrew Gordon
Publication Year:
2023

Abstract

Real-world datasets are often highly class-imbalanced, which can adversely impact the performance of deep learning models. The majority of research on training neural networks under class imbalance has focused on specialized loss functions, sampling techniques, or two-stage training procedures. Notably, we demonstrate that simply tuning existing components of standard deep learning pipelines, such as the batch size, data augmentation, optimizer, and label smoothing, can achieve state-of-the-art performance without any such specialized class imbalance methods. We also provide key prescriptions and considerations for training under class imbalance, and an understanding of why imbalance methods succeed or fail.

Comment: NeurIPS 2023. Code available at https://github.com/ravidziv/SimplifyingImbalancedTraining
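
As an illustration only, and not the authors' implementation (see the linked repository for their code), the minimal PyTorch sketch below shows where the components named in the abstract (batch size, data augmentation, optimizer, and label smoothing) sit in a standard training pipeline. The synthetic data, model architecture, and all hyperparameter values are assumptions chosen for demonstration.

import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Synthetic class-imbalanced data: 1000 majority vs. 50 minority samples
# (an assumption for demonstration; the paper uses real datasets).
x = torch.randn(1050, 32)
y = torch.cat([torch.zeros(1000, dtype=torch.long),
               torch.ones(50, dtype=torch.long)])

# Batch size is one of the tuned pipeline components; 32 is illustrative.
loader = DataLoader(TensorDataset(x, y), batch_size=32, shuffle=True)

# A small stand-in model (assumption, not the paper's architecture).
model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 2))

# Optimizer choice and its settings are another tuned component;
# SGD with momentum is an illustrative pick.
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)

# Label smoothing via the standard cross-entropy loss; no specialized
# imbalance-aware loss function is involved.
criterion = nn.CrossEntropyLoss(label_smoothing=0.1)

def augment(batch):
    # Stand-in for data augmentation; for images this would be random
    # crops, flips, etc. Additive Gaussian noise is an assumption here.
    return batch + 0.05 * torch.randn_like(batch)

for epoch in range(5):
    for xb, yb in loader:
        optimizer.zero_grad()
        loss = criterion(model(augment(xb)), yb)
        loss.backward()
        optimizer.step()

Every knob above (batch size, augmentation, optimizer, label smoothing amount) is part of a standard pipeline; the paper's claim is that tuning these suffices under class imbalance, without resampling, reweighting, or two-stage training.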

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2312.02517
Document Type:
Working Paper