Dynamic training for handling textual label noise
- Authors
- Cheng, Shaohuan; Chen, Wenyu; Liu, Wanlong; Zhou, Li; Zhao, Honglin; Kong, Weishan; Qu, Hong; Fu, Mingsheng
- Subjects
- Artificial neural networks; Noise; Memorization; Generalization
- Abstract
Label noise causes deep neural networks to gradually memorize incorrect labels, degrading generalization. In this paper, based on three observations of learning behavior in textual noise scenarios, we propose a dynamic training method to enhance model robustness and generalization against textual label noise. The method corrects noisy labels by dynamically incorporating the model's own predictions: the weight given to the original labels is a decay function of training time that follows the model's learning dynamics. Additionally, our method introduces R-Drop and prior regularization terms so that the single-model backbone produces reliable predictions, and thus accurate corrected labels. This design removes the stage splitting and data partitioning required by existing state-of-the-art methods and effectively mitigates the adverse impact of erroneous labels without introducing additional dependencies. Experimental results on four text classification datasets demonstrate that dynamic training outperforms strong baselines designed for class-conditional and instance-dependent noise within the common noise range. Our code is available at https://github.com/shaohuancheng/noisy_label_for_exp_decay.
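The abstract describes the update rule concretely enough to sketch. Below is a minimal PyTorch sketch of the label-correction idea, assuming an exponential decay schedule (suggested by the repository name) and the standard R-Drop consistency term; `decay_rate`, `alpha`, and all function names are illustrative assumptions rather than the paper's exact formulation, and the prior regularization term is omitted. See the authors' repository for the actual implementation.

```python
import math

import torch
import torch.nn.functional as F


def label_weight(step: int, total_steps: int, decay_rate: float = 5.0) -> float:
    """Weight on the original labels, decaying exponentially with training time.

    decay_rate is a hypothetical hyperparameter, not taken from the paper.
    """
    return math.exp(-decay_rate * step / total_steps)


def corrected_targets(logits: torch.Tensor, onehot: torch.Tensor, w: float) -> torch.Tensor:
    """Convex combination of the (possibly noisy) one-hot labels and the model's
    current softmax predictions; predictions are detached so the targets
    receive no gradient."""
    preds = F.softmax(logits.detach(), dim=-1)
    return w * onehot + (1.0 - w) * preds


def rdrop_kl(logits_a: torch.Tensor, logits_b: torch.Tensor) -> torch.Tensor:
    """Symmetric KL between two dropout-perturbed forward passes (R-Drop),
    pushing the single backbone toward consistent predictions."""
    log_p = F.log_softmax(logits_a, dim=-1)
    log_q = F.log_softmax(logits_b, dim=-1)
    return 0.5 * (
        F.kl_div(log_p, log_q, reduction="batchmean", log_target=True)
        + F.kl_div(log_q, log_p, reduction="batchmean", log_target=True)
    )


# Hypothetical training step (logits: [batch, classes], onehot: [batch, classes]):
#   logits1, logits2 = model(x), model(x)           # two stochastic dropout passes
#   w = label_weight(step, total_steps)
#   targets = corrected_targets(logits1, onehot, w)
#   ce = -(targets * F.log_softmax(logits1, dim=-1)).sum(dim=-1).mean()
#   loss = ce + alpha * rdrop_kl(logits1, logits2)  # alpha: illustrative weight
```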
- Published
- 2024