
Learning to Augment Distributions for Out-of-Distribution Detection

Authors :
Wang, Q
Fang, Z
Zhang, Y
Liu, F
Li, Y
Han, B
Publication Year :
2023

Abstract

Open-world classification systems should discern out-of-distribution (OOD) data whose labels deviate from those of in-distribution (ID) cases, motivating recent studies in OOD detection. Despite promising progress, advanced methods may still fail in the open world owing to the lack of knowledge about unseen OOD data in advance. Although one can access auxiliary OOD data (distinct from the unseen ones) for model training, it remains unclear how such auxiliary data help in the open world. To this end, we study the problem from a learning-theory perspective and find that the distribution discrepancy between the auxiliary and the unseen real OOD data is the key factor affecting open-world detection performance. Accordingly, we propose Distributional-Augmented OOD Learning (DAL), which alleviates the OOD distribution discrepancy by crafting an OOD distribution set that contains all distributions in a Wasserstein ball centered on the auxiliary OOD distribution. We justify that a predictor trained on the worst-case OOD data in the ball can shrink the OOD distribution discrepancy, thus improving open-world detection performance given only the auxiliary OOD data. Extensive evaluations across representative OOD detection setups demonstrate the superiority of DAL over its advanced counterparts. The code is publicly available at: https://github.com/tmlr-group/DAL.
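
The training scheme described in the abstract is a distributionally robust, min-max procedure: an inner step searches for worst-case OOD inputs within a Wasserstein ball around the auxiliary OOD distribution, and an outer step trains the detector against them. The sketch below is not the authors' implementation (see the linked repository for that). It assumes a common perturbation-based proxy for the inner Wasserstein maximization and an outlier-exposure-style OOD regularizer; the names worst_case_ood and training_step and the hyperparameters steps, step_size, gamma, and beta are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def ood_reg_loss(logits):
    """Outlier-exposure-style regularizer: cross-entropy between the model's
    predictions and the uniform distribution (up to an additive constant)."""
    log_probs = F.log_softmax(logits, dim=1)
    return -log_probs.mean(dim=1).mean()

def worst_case_ood(model, x_aux, steps=5, step_size=0.01, gamma=1.0):
    """Inner maximization (proxy): perturb auxiliary OOD samples toward inputs
    where the OOD regularizer is largest, minus a squared transport penalty
    that keeps them close to the auxiliary distribution (standing in for the
    Wasserstein-ball constraint)."""
    x_adv = x_aux.clone().detach()
    for _ in range(steps):
        x_adv.requires_grad_(True)
        transport_cost = ((x_adv - x_aux) ** 2).flatten(1).sum(1).mean()
        obj = ood_reg_loss(model(x_adv)) - gamma * transport_cost
        grad, = torch.autograd.grad(obj, x_adv)
        # Gradient ascent on the penalized objective (signed step for stability).
        x_adv = (x_adv + step_size * grad.sign()).detach()
    return x_adv

def training_step(model, optimizer, x_id, y_id, x_aux, beta=0.5):
    """Outer minimization: standard ID classification loss plus the OOD
    regularizer evaluated on the augmented (worst-case) OOD inputs."""
    x_ood = worst_case_ood(model, x_aux)
    optimizer.zero_grad()
    loss = F.cross_entropy(model(x_id), y_id) + beta * ood_reg_loss(model(x_ood))
    loss.backward()
    optimizer.step()
    return loss.item()
```

In this proxy, the transport penalty gamma controls how far the augmented OOD samples may drift from the auxiliary ones, playing the role of the Wasserstein-ball radius in the paper's formulation.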

Details

Database :
OAIster
Publication Type :
Electronic Resource
Accession number :
edsoai.on1439659260
Document Type :
Electronic Resource