Label Noise: Ignorance Is Bliss

Authors :
Zhu, Yilun
Zhang, Jianxin
Gangrade, Aditya
Scott, Clayton
Publication Year :
2024

Abstract

We establish a new theoretical framework for learning under multi-class, instance-dependent label noise. This framework casts learning with label noise as a form of domain adaptation, in particular, domain adaptation under posterior drift. We introduce the concept of \emph{relative signal strength} (RSS), a pointwise measure that quantifies the transferability from the noisy posterior to the clean posterior. Using RSS, we establish nearly matching upper and lower bounds on the excess risk. Our theoretical findings support the simple \emph{Noise Ignorant Empirical Risk Minimization (NI-ERM)} principle, which minimizes empirical risk while ignoring label noise. Finally, we translate this theoretical insight into practice: by using NI-ERM to fit a linear classifier on top of a self-supervised feature extractor, we achieve state-of-the-art performance on the CIFAR-N data challenge.
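The NI-ERM recipe described above can be illustrated with a minimal sketch: train a linear classifier on (stand-in) features using noisy labels as-is, then evaluate against the clean labels. The synthetic Gaussian features, the 30% symmetric noise rate, and the plain gradient-descent logistic fit are all illustrative assumptions, not the authors' actual pipeline or the CIFAR-N setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for self-supervised features: two well-separated Gaussian
# clusters in a 16-dimensional feature space (hypothetical data, not CIFAR-N).
n, d = 1000, 16
y_clean = rng.integers(0, 2, size=n)
X = rng.normal(size=(n, d)) + (2.0 * y_clean - 1.0)[:, None]

# Corrupt labels with symmetric noise: each label flips with probability 0.3.
flip = rng.random(n) < 0.3
y_noisy = np.where(flip, 1 - y_clean, y_clean)

# NI-ERM: minimize empirical (logistic) risk on the noisy labels,
# making no attempt to model or correct the noise.
w, b, lr = np.zeros(d), 0.0, 0.1
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted P(y=1 | x)
    w -= lr * (X.T @ (p - y_noisy)) / n      # gradient of mean cross-entropy
    b -= lr * np.mean(p - y_noisy)

# Evaluate the noise-ignorant classifier against the *clean* labels.
acc = np.mean(((X @ w + b) > 0) == y_clean)
```

On this easy synthetic task, the classifier fit on noisy labels recovers the clean decision boundary almost exactly, echoing the paper's point that ignoring symmetric-enough noise can be essentially harmless.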

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2411.00079
Document Type :
Working Paper