Learning Accurate Pseudo-Labels via Feature Similarity in the Presence of Label Noise.
- Source :
- Applied Sciences (2076-3417); Apr 2024, Vol. 14 Issue 7, p2759, 16p
- Publication Year :
- 2024
Abstract
- Despite the exceptional learning capabilities of deep neural networks (DNNs), they continue to struggle to handle label noise. To address this challenge, the pseudo-label approach has emerged as a preferred solution. Recent works have achieved significant improvements by exploring the information contained in DNN predictions and designing a straightforward method to incorporate model predictions into the training process, using a convex combination of the original labels and the predictions as the training targets. However, these methods overlook the feature-level information contained in the samples, which significantly impacts the accuracy of the pseudo-labels. This study introduces a straightforward yet potent technique named FPL (feature pseudo-label), which leverages information from both model predictions and feature similarity. Additionally, we utilize an exponential moving average scheme to improve the stability of the corrected pseudo-labels. Extensive experiments were carried out on synthetic and real datasets across different noise types, reaching the highest accuracy of 94.13% (Top-1) on CIFAR10 and 73.54% on Clothing1M. These results demonstrate the efficacy and robustness of the method when learning amid label noise. [ABSTRACT FROM AUTHOR]
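- The abstract describes three ingredients: a convex combination of the original (possibly noisy) labels with model predictions, a feature-similarity signal, and an exponential moving average (EMA) to stabilize the corrected targets. The sketch below is an illustrative reading of that recipe, not the paper's actual algorithm; the function names, the prototype-based similarity, the temperature, and the weights `alpha` and `ema_momentum` are all assumptions made for the example.
```python
# Hypothetical sketch of pseudo-label correction combining prediction- and
# feature-level evidence with an EMA update, loosely following the abstract.
import torch
import torch.nn.functional as F


def class_prototypes(features: torch.Tensor, soft_labels: torch.Tensor) -> torch.Tensor:
    """Soft-label-weighted mean feature per class; returns a (C, D) tensor."""
    # features: (N, D), soft_labels: (N, C)
    weights = soft_labels / soft_labels.sum(dim=0, keepdim=True).clamp_min(1e-8)
    return weights.t() @ features


def update_pseudo_labels(
    noisy_onehot: torch.Tensor,   # (N, C) original, possibly noisy, one-hot labels
    predictions: torch.Tensor,    # (N, C) softmax outputs of the model
    features: torch.Tensor,       # (N, D) penultimate-layer features
    prev_pseudo: torch.Tensor,    # (N, C) pseudo-labels from the previous epoch
    alpha: float = 0.6,           # weight on model/feature evidence vs. original label (assumed)
    ema_momentum: float = 0.9,    # EMA smoothing factor across epochs (assumed)
) -> torch.Tensor:
    # 1) Feature-level evidence: cosine similarity of each sample to class prototypes.
    protos = class_prototypes(features, predictions)
    sim = F.normalize(features, dim=1) @ F.normalize(protos, dim=1).t()  # (N, C)
    feat_probs = F.softmax(sim / 0.1, dim=1)                             # temperature 0.1 (assumed)

    # 2) Combine prediction-level and feature-level evidence (simple average here).
    evidence = 0.5 * predictions + 0.5 * feat_probs

    # 3) Convex combination with the original labels to form corrected targets.
    corrected = (1.0 - alpha) * noisy_onehot + alpha * evidence

    # 4) EMA over epochs to stabilize the corrected pseudo-labels, then renormalize.
    pseudo = ema_momentum * prev_pseudo + (1.0 - ema_momentum) * corrected
    return pseudo / pseudo.sum(dim=1, keepdim=True)


if __name__ == "__main__":
    N, C, D = 8, 10, 64
    noisy = F.one_hot(torch.randint(0, C, (N,)), C).float()
    preds = F.softmax(torch.randn(N, C), dim=1)
    feats = torch.randn(N, D)
    targets = update_pseudo_labels(noisy, preds, feats, prev_pseudo=noisy.clone())
    print(targets.shape)  # torch.Size([8, 10])
```
- In such a scheme, a larger `alpha` trusts the model and feature evidence more, while a larger `ema_momentum` makes the targets change more slowly between epochs; both trade correction speed against stability.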
Details
- Language :
- English
- ISSN :
- 2076-3417
- Volume :
- 14
- Issue :
- 7
- Database :
- Complementary Index
- Journal :
- Applied Sciences (2076-3417)
- Publication Type :
- Academic Journal
- Accession number :
- 176596976
- Full Text :
- https://doi.org/10.3390/app14072759