Gray Learning From Non-IID Data With Out-of-Distribution Samples.
- Source :
- IEEE transactions on neural networks and learning systems [IEEE Trans Neural Netw Learn Syst] 2023 Nov 14; Vol. PP. Date of Electronic Publication: 2023 Nov 14.
- Publication Year :
- 2023
- Publisher :
- Ahead of Print
Abstract
- The integrity of training data, even when annotated by experts, is far from guaranteed, especially for non-independent and identically distributed (non-IID) datasets comprising both in- and out-of-distribution samples. In an ideal scenario, the majority of samples would be in-distribution, while samples that deviate semantically would be identified as out-of-distribution and excluded during the annotation process. However, experts may erroneously classify these out-of-distribution samples as in-distribution, assigning them labels that are inherently unreliable. This mixture of unreliable labels and varied data types makes the task of learning robust neural networks notably challenging. We observe that both in- and out-of-distribution samples can almost invariably be ruled out from belonging to certain classes, aside from those corresponding to unreliable ground-truth labels. This opens the possibility of utilizing reliable complementary labels that indicate the classes to which a sample does not belong. Guided by this insight, we introduce a novel approach, termed gray learning (GL), which leverages both ground-truth and complementary labels. Crucially, GL adaptively adjusts the loss weights for these two label types based on prediction confidence levels. By grounding our approach in statistical learning theory, we derive bounds for the generalization error, demonstrating that GL achieves tight constraints even in non-IID settings. Extensive experimental evaluations reveal that our method significantly outperforms alternative approaches grounded in robust statistics.
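
To make the abstract's idea concrete, the sketch below shows one plausible way to combine a ground-truth (cross-entropy) term with a complementary-label term, weighted per sample by the model's own confidence. It is a minimal illustration based only on the abstract, not the authors' published formulation: the function name `gray_learning_style_loss`, the uniform sampling of a single complementary class, and the specific confidence-based weighting are all assumptions introduced here for clarity.

```python
import torch
import torch.nn.functional as F

def gray_learning_style_loss(logits, labels, num_classes):
    # Hypothetical sketch of a GL-style objective: an ordinary cross-entropy
    # term on the (possibly unreliable) ground-truth labels is mixed with a
    # complementary-label term, weighted per sample by prediction confidence.
    probs = F.softmax(logits, dim=1)

    # Confidence the model assigns to the annotated class (detached so the
    # mixing weights themselves are not trained).
    conf = probs.gather(1, labels.unsqueeze(1)).squeeze(1).detach()

    # Ordinary term: trust the given ground-truth label.
    ce = F.cross_entropy(logits, labels, reduction="none")

    # Complementary term: pick, per sample, one class other than the given
    # label and penalize probability mass assigned to it ("the sample does
    # not belong to this class").
    offsets = torch.randint(1, num_classes, labels.shape, device=labels.device)
    comp_labels = (labels + offsets) % num_classes
    p_comp = probs.gather(1, comp_labels.unsqueeze(1)).squeeze(1)
    comp = -torch.log((1.0 - p_comp).clamp_min(1e-12))

    # Confident predictions lean on the ground-truth term; uncertain ones
    # fall back to the (more reliable) complementary term.
    return (conf * ce + (1.0 - conf) * comp).mean()
```

The design intent mirrors the abstract: when the network is confident in the annotated label the ordinary loss dominates, whereas suspect (likely out-of-distribution) samples contribute mainly through the complementary term, which only asserts which classes they do not belong to.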
Details
- Language :
- English
- ISSN :
- 2162-2388
- Volume :
- PP
- Database :
- MEDLINE
- Journal :
- IEEE transactions on neural networks and learning systems
- Publication Type :
- Academic Journal
- Accession number :
- 37962995
- Full Text :
- https://doi.org/10.1109/TNNLS.2023.3330475