
Fed-GraB: Federated Long-tailed Learning with Self-Adjusting Gradient Balancer

Authors:
Xiao, Zikai
Chen, Zihan
Liu, Songshang
Wang, Hualiang
Feng, Yang
Hao, Jin
Zhou, Joey Tianyi
Wu, Jian
Yang, Howard Hao
Liu, Zuozhu
Publication Year:
2023

Abstract

Data privacy and long-tailed distribution are the norm rather than the exception in many real-world tasks. This paper investigates a federated long-tailed learning (Fed-LT) task in which each client holds a locally heterogeneous dataset; if the datasets could be globally aggregated, they would jointly exhibit a long-tailed distribution. Under such a setting, existing federated optimization and/or centralized long-tailed learning methods hardly apply, due to challenges in (a) characterizing the global long-tailed distribution under privacy constraints and (b) adjusting the local learning strategy to cope with the head-tail imbalance. In response, we propose a method termed $\texttt{Fed-GraB}$, comprising a Self-adjusting Gradient Balancer (SGB) module that re-weights clients' gradients in a closed-loop manner, based on feedback about the global long-tailed distribution evaluated by a Direct Prior Analyzer (DPA) module. Using $\texttt{Fed-GraB}$, clients can effectively alleviate the distribution drift caused by data heterogeneity during model training and obtain a global model with better performance on the minority classes while maintaining the performance of the majority classes. Extensive experiments demonstrate that $\texttt{Fed-GraB}$ achieves state-of-the-art performance on representative datasets such as CIFAR-10-LT, CIFAR-100-LT, ImageNet-LT, and iNaturalist.

Comment: Accepted by NeurIPS 2023
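The abstract gives only a high-level description of SGB's closed-loop re-weighting. The Python sketch below illustrates one way such a feedback loop could work; it is a minimal illustration under stated assumptions, not the authors' implementation. Here `class_prior` stands in for a DPA-style estimate of the global class distribution, and `SimpleGradientBalancer`, its `gain` parameter, and the uniform-share target are all hypothetical choices introduced for illustration.

import numpy as np

class SimpleGradientBalancer:
    """Hypothetical closed-loop per-class gradient re-weighting.

    `class_prior` stands in for a DPA-style estimate of the global
    class distribution. The controller raises the weight of classes
    whose accumulated gradient share lags behind a uniform target,
    so tail classes receive larger effective updates over time.
    """

    def __init__(self, class_prior, gain=0.1):
        self.prior = np.asarray(class_prior, dtype=float)
        self.gain = gain                          # proportional feedback gain (assumed)
        self.weights = np.ones_like(self.prior)   # per-class re-weighting factors
        self.accum = np.zeros_like(self.prior)    # accumulated |gradient| per class

    def reweight(self, per_class_grads):
        """Scale per-class gradients, then update weights from feedback."""
        grads = np.asarray(per_class_grads, dtype=float)
        balanced = grads * self.weights

        # Closed-loop step: compare each class's share of accumulated
        # gradient magnitude with the uniform target and nudge the weight.
        self.accum += np.abs(balanced)
        share = self.accum / (self.accum.sum() + 1e-12)
        target = 1.0 / len(self.prior)
        self.weights += self.gain * (target - share) / target
        self.weights = np.clip(self.weights, 0.1, 10.0)
        return balanced

# Toy usage: a long-tailed prior where class 0 dominates; head classes
# produce proportionally larger raw gradients each round.
rng = np.random.default_rng(0)
prior = np.array([0.7, 0.2, 0.1])
balancer = SimpleGradientBalancer(prior)
for _ in range(20):
    raw = rng.random(3) * prior
    balancer.reweight(raw)
print("final per-class weights:", balancer.weights)

In the paper itself, the feedback rule, the prior estimation under privacy constraints, and the interaction with federated aggregation are all more involved; the sketch only conveys the closed-loop structure described in the abstract.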

Details

Database:
OAIster
Publication Type:
Electronic Resource
Accession number:
edsoai.on1438489463
Document Type:
Electronic Resource