1. FedFixer: Mitigating Heterogeneous Label Noise in Federated Learning
- Author
Ji, Xinyuan; Zhu, Zhaowei; Xi, Wei; Gadyatskaya, Olga; Song, Zilong; Cai, Yong; and Liu, Yang
- Subjects
Computer Science - Machine Learning; Computer Science - Artificial Intelligence
- Abstract
Federated Learning (FL) heavily depends on label quality for its performance. However, the label distribution among individual clients is often both noisy and heterogeneous. The high loss incurred by client-specific samples under heterogeneous label noise makes it difficult to distinguish client-specific samples from noisy-label samples, limiting the effectiveness of existing label noise learning approaches. To tackle this issue, we propose FedFixer, in which a personalized model cooperates with the global model to effectively select clean client-specific samples. In the dual models, updating the personalized model solely at the local level can lead to overfitting on noisy data due to limited samples, which in turn degrades both the local and global models' performance. We mitigate this overfitting from two perspectives. First, we employ a confidence regularizer to alleviate the impact of unconfident predictions caused by label noise. Second, a distance regularizer constrains the disparity between the personalized and global models. We validate the effectiveness of FedFixer through extensive experiments on benchmark datasets. The results demonstrate that FedFixer filters noisy label samples well across different clients, especially in highly heterogeneous label noise scenarios.
- Comment
Accepted by AAAI 2024
- Published
2024
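
The abstract outlines a dual-model local update: each model keeps the samples it deems clean, and the personalized model is regularized both toward confident predictions and toward the global model. Below is a minimal, hypothetical sketch of that idea in PyTorch. It is not the paper's actual algorithm or API: the small-loss co-selection rule, the entropy-style confidence penalty, the L2 distance penalty, and all names (`confidence_regularizer`, `distance_regularizer`, `local_update`, `lambda_conf`, `lambda_dist`) are illustrative assumptions.

```python
# Hypothetical sketch of a dual-model local update with confidence and
# distance regularizers, loosely following the abstract's description.
import torch
import torch.nn.functional as F


def confidence_regularizer(logits):
    """One common form of a confidence penalty: negative entropy of the
    batch-averaged prediction (lower when predictions are confident/diverse)."""
    probs = F.softmax(logits, dim=1)
    mean_probs = probs.mean(dim=0)
    return torch.sum(mean_probs * torch.log(mean_probs + 1e-8))


def distance_regularizer(personalized, global_model):
    """Squared L2 distance between personalized and (frozen) global parameters."""
    return sum(
        (p - g.detach()).pow(2).sum()
        for p, g in zip(personalized.parameters(), global_model.parameters())
    )


def local_update(global_model, personalized, loader, lr=0.01,
                 lambda_conf=0.1, lambda_dist=0.01, keep_ratio=0.5):
    """One local round: each model trains on the small-loss samples selected
    by the other model (a co-teaching-style stand-in for the paper's rule)."""
    opt_g = torch.optim.SGD(global_model.parameters(), lr=lr)
    opt_p = torch.optim.SGD(personalized.parameters(), lr=lr)

    for x, y in loader:
        logits_g = global_model(x)
        logits_p = personalized(x)
        loss_g = F.cross_entropy(logits_g, y, reduction="none")
        loss_p = F.cross_entropy(logits_p, y, reduction="none")

        # Small-loss sample selection, cross-assigned between the two models.
        keep = max(1, int(keep_ratio * len(y)))
        idx_for_g = torch.argsort(loss_p)[:keep]
        idx_for_p = torch.argsort(loss_g)[:keep]

        # Update the global model on samples the personalized model trusts.
        obj_g = (loss_g[idx_for_g].mean()
                 + lambda_conf * confidence_regularizer(logits_g[idx_for_g]))
        opt_g.zero_grad()
        obj_g.backward()
        opt_g.step()

        # Update the personalized model, kept close to the global model.
        logits_p = personalized(x)  # recompute after the global step
        loss_p = F.cross_entropy(logits_p, y, reduction="none")
        obj_p = (loss_p[idx_for_p].mean()
                 + lambda_conf * confidence_regularizer(logits_p[idx_for_p])
                 + lambda_dist * distance_regularizer(personalized, global_model))
        opt_p.zero_grad()
        obj_p.backward()
        opt_p.step()

    return global_model, personalized
```

In this sketch the distance term plays the role of a proximal constraint: it lets the personalized model adapt to client-specific samples while discouraging it from drifting onto the client's noisy labels, which is the overfitting concern the abstract raises.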