Generalization vs. Specialization under Concept Shift

Authors:
Nguyen, Alex
Schwab, David J.
Ngampruetikorn, Vudtiwat
Publication Year:
2024

Abstract

Machine learning models are often brittle under distribution shift, i.e., when data distributions at test time differ from those during training. Understanding this failure mode is central to identifying and mitigating safety risks of mass adoption of machine learning. Here we analyze ridge regression under concept shift -- a form of distribution shift in which the input-label relationship changes at test time. We derive an exact expression for prediction risk in the high-dimensional limit. Our results reveal nontrivial effects of concept shift on generalization performance, depending on the properties of robust and nonrobust features of the input. We show that test performance can exhibit a nonmonotonic data dependence, even when double descent is absent. Finally, our experiments on MNIST and FashionMNIST suggest that this intriguing behavior is present also in classification problems.

Comment: 8 pages, 3 figures
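
The setup described in the abstract -- ridge regression trained under one input-label relationship and evaluated under a shifted one -- can be illustrated with a minimal simulation. This is a sketch only, not the paper's derivation or experimental code; the dimension, teacher weights, noise level, and choice of which coordinates shift at test time are assumptions made here for demonstration.

```python
# Illustrative sketch (not the authors' code): ridge regression under concept
# shift, where the input-label relationship differs between train and test.
# All sizes and the shift construction below are assumptions for demonstration.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
d = 200                                    # input dimension (assumed)
w_train = rng.normal(size=d) / np.sqrt(d)  # teacher weights at training time

# Concept shift: at test time, a subset of "nonrobust" coordinates of the
# teacher weights is resampled, so labels depend on inputs differently.
w_test = w_train.copy()
nonrobust = rng.choice(d, size=d // 2, replace=False)
w_test[nonrobust] = rng.normal(size=d // 2) / np.sqrt(d)

def make_data(n, w, noise=0.1):
    """Draw n Gaussian inputs and noisy linear labels with teacher weights w."""
    X = rng.normal(size=(n, d))
    y = X @ w + noise * rng.normal(size=n)
    return X, y

X_test, y_test = make_data(5000, w_test)

# Test risk as a function of training-set size: under concept shift this
# curve need not decrease monotonically as more training data are added.
for n in [50, 100, 200, 400, 800, 1600]:
    X_train, y_train = make_data(n, w_train)
    model = Ridge(alpha=1e-3).fit(X_train, y_train)
    risk = np.mean((model.predict(X_test) - y_test) ** 2)
    print(f"n = {n:5d}   test risk = {risk:.3f}")
```

Varying the fraction of resampled coordinates or the ridge penalty changes how strongly the shift degrades test risk, which loosely mirrors the abstract's point that the effect depends on the properties of robust and nonrobust features.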

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2409.15582
Document Type:
Working Paper