FedDBO: A Novel Federated Learning Approach for Communication Cost and Data Heterogeneity Using Dung Beetle Optimizer
- Authors
Dongyan Wang, Limin Chen, Xiaotong Lu, Yidi Wang, Yue Shen, and Jingjing Xu
- Subjects
Federated learning, dung beetle optimizer, model scores, data heterogeneity, communication cost, Electrical engineering. Electronics. Nuclear engineering, TK1-9971
- Abstract
As an emerging distributed machine learning technology, federated learning has gained widespread attention due to its privacy protection mechanism. However, it also faces challenges such as high communication costs and heterogeneous client data. To address these issues, this paper proposes a federated learning approach based on the dung beetle optimizer, named FedDBO. In this method, the model parameters uploaded from clients to the server are transformed into model scores. In each training round, only a subset of clients with high model scores is selected to upload their parameters to the server, thereby reducing communication costs. Simultaneously, a model retraining strategy is introduced: after aggregating the model parameters sent by clients, the server performs a second round of iterative training on the aggregated model using its own metadata, thereby mitigating data heterogeneity and improving model performance. In addition, a convergence proof is provided, demonstrating that the model aggregated by FedDBO converges to the aggregated model of FedAvg after each training round. Finally, experiments show that, when various data-heterogeneous environments are simulated on the datasets, FedDBO achieves higher accuracy and better stability than three other algorithms: FedAvg, FedShare, and FedPSO.
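Since only the abstract is available here, the Python sketch below illustrates the round structure it describes under explicit assumptions: models are represented as flat NumPy vectors, the "model score" is taken to be local validation accuracy, and the dung-beetle-optimizer-driven client selection is simplified to picking the top-scoring clients. These choices are placeholders for illustration, not the paper's actual procedure.

```python
# Minimal sketch of one FedDBO-style round, based only on the abstract.
# Assumptions (not from the paper): flat NumPy parameter vectors, the
# "model score" is local validation accuracy, and DBO-based selection is
# replaced by simple top-k selection for brevity.
import numpy as np

rng = np.random.default_rng(0)
DIM, NUM_CLIENTS, TOP_K = 10, 8, 3

def local_train(global_params, data_x, data_y, lr=0.1, epochs=5):
    """Plain logistic-regression SGD as a stand-in for client training."""
    w = global_params.copy()
    for _ in range(epochs):
        pred = 1.0 / (1.0 + np.exp(-data_x @ w))
        grad = data_x.T @ (pred - data_y) / len(data_y)
        w -= lr * grad
    return w

def score(params, val_x, val_y):
    """Hypothetical model score: accuracy on a client-held validation split."""
    pred = (val_x @ params) > 0
    return float((pred == val_y).mean())

# Synthetic, mildly non-IID client data (illustration only).
clients = []
for _ in range(NUM_CLIENTS):
    x = rng.normal(size=(50, DIM)) + rng.normal(scale=2.0, size=DIM)
    y = (x @ rng.normal(size=DIM) > 0).astype(float)
    clients.append((x[:40], y[:40], x[40:], y[40:]))  # train / val split

global_w = np.zeros(DIM)

# 1) Every client trains locally and reports only a scalar score.
local_models, scores = [], []
for tr_x, tr_y, va_x, va_y in clients:
    w = local_train(global_w, tr_x, tr_y)
    local_models.append(w)
    scores.append(score(w, va_x, va_y))

# 2) The server requests full parameters only from the TOP_K highest-scoring
#    clients; the other clients send nothing but their score this round.
selected = np.argsort(scores)[-TOP_K:]
global_w = np.mean([local_models[i] for i in selected], axis=0)

# 3) Server-side retraining on its own held-out data to soften heterogeneity
#    (the abstract calls this the server's "metadata"; here it is synthetic).
server_x = rng.normal(size=(30, DIM))
server_y = (server_x @ rng.normal(size=DIM) > 0).astype(float)
global_w = local_train(global_w, server_x, server_y, epochs=2)

print("selected clients:", selected, "new global norm:", np.linalg.norm(global_w))
```

In this reading, the communication saving comes from step 2 (unselected clients transmit only a scalar score rather than a full parameter vector), and the heterogeneity mitigation comes from step 3 (server-side retraining of the aggregated model).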
- Published
2024