Dynamic Auxiliary Soft Labels for decoupled learning.
- Source :
- Neural networks : the official journal of the International Neural Network Society [Neural Netw] 2022 Jul; Vol. 151, pp. 132-142. Date of Electronic Publication: 2022 Mar 31.
- Publication Year :
- 2022
Abstract
- The long-tailed distribution of many datasets is a major challenge for deep learning: convolutional neural networks perform poorly on classes with only a few samples. Separating the feature learning stage from the classifier learning stage, known as decoupled learning, has been shown to mitigate this problem effectively. We use soft labels to improve the decoupled learning framework by proposing a Dynamic Auxiliary Soft Labels (DaSL) method. Specifically, we design a dedicated auxiliary network that generates auxiliary soft labels for the two training stages. In the feature learning stage, the soft labels help learn features with smaller intra-class variance; in the classifier learning stage, they help alleviate the overconfidence of model predictions. We also introduce a feature-level distillation method for feature learning and improve the learning of general features through multi-scale feature fusion. Extensive experiments on three long-tailed recognition benchmark datasets demonstrate the effectiveness of DaSL.
- Competing Interests: Declaration of Competing Interest: The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
- (Copyright © 2022 Elsevier Ltd. All rights reserved.)
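The abstract does not specify DaSL's exact loss. As a rough, generic sketch of how auxiliary soft labels typically enter training, not the paper's actual formulation, the target can be a blend of the one-hot hard label and an auxiliary network's output distribution; the function name `soft_label_loss` and the blending weight `alpha` are illustrative assumptions:

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over the last axis."""
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def soft_label_loss(logits, hard_labels, aux_probs, alpha=0.5):
    """Cross-entropy against a blended target:
    (1 - alpha) * one-hot hard label + alpha * auxiliary soft labels.
    With alpha = 0 this reduces to standard cross-entropy; larger alpha
    spreads target mass over non-ground-truth classes, which discourages
    overconfident predictions."""
    num_classes = logits.shape[-1]
    one_hot = np.eye(num_classes)[hard_labels]          # (batch, classes)
    target = (1.0 - alpha) * one_hot + alpha * aux_probs
    log_p = np.log(softmax(logits) + 1e-12)
    return float(-(target * log_p).sum(axis=-1).mean())

# Illustrative usage with a single 3-class example.
logits = np.array([[2.0, 0.5, 0.1]])                    # model prediction
aux = softmax(np.array([[1.5, 0.8, 0.2]]))              # auxiliary network output
loss = soft_label_loss(logits, np.array([0]), aux, alpha=0.3)
```

Because the blended target keeps some probability mass on non-ground-truth classes, the loss never drives the predicted distribution all the way to a one-hot vector, which is the usual mechanism by which soft labels temper overconfidence.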
- Subjects :
- Benchmarking
- Neural Networks, Computer
Details
- Language :
- English
- ISSN :
- 1879-2782
- Volume :
- 151
- Database :
- MEDLINE
- Journal :
- Neural networks : the official journal of the International Neural Network Society
- Publication Type :
- Academic Journal
- Accession number :
- 35421708
- Full Text :
- https://doi.org/10.1016/j.neunet.2022.03.027