
An Element-Wise Weights Aggregation Method for Federated Learning

Authors :
Hu, Yi
Ren, Hanchi
Hu, Chen
Deng, Jingjing
Xie, Xianghua
Publication Year :
2024

Abstract

Federated learning (FL) is a powerful Machine Learning (ML) paradigm that enables distributed clients to collaboratively learn a shared global model while keeping the data on the original device, thereby preserving privacy. A central challenge in FL is the effective aggregation of local model weights from disparate and potentially unbalanced participating clients. Existing methods often treat each client indiscriminately, applying a single proportion to the entire local model. However, it is empirically advantageous for each weight to be assigned a specific proportion. This paper introduces an innovative Element-Wise Weights Aggregation Method for Federated Learning (EWWA-FL) aimed at optimizing learning performance and accelerating convergence speed. Unlike traditional FL approaches, EWWA-FL aggregates local weights to the global model at the level of individual elements, thereby allowing each participating client to make element-wise contributions to the learning process. By taking into account the unique dataset characteristics of each client, EWWA-FL enhances the robustness of the global model to different datasets while also achieving rapid convergence. The method is flexible enough to employ various weighting strategies. Through comprehensive experiments, we demonstrate the advanced capabilities of EWWA-FL, showing significant improvements in both accuracy and convergence speed across a range of backbones and benchmarks.

Comment: 2023 IEEE International Conference on Data Mining Workshops (ICDMW)
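To make the contrast with client-level schemes concrete, the following is a minimal sketch of element-wise aggregation in NumPy. It is not the paper's implementation: the per-element coefficients here are a hypothetical weighting strategy (a softmax over per-element update magnitudes across clients), chosen only to illustrate that each coordinate of each client's update can receive its own proportion, whereas FedAvg-style methods apply one scalar proportion per client.

```python
import numpy as np

def elementwise_aggregate(local_weights, global_weights):
    """Aggregate client weight tensors into the global model element-wise.

    Instead of one scalar proportion per client, every element of every
    client's update gets its own mixing coefficient. The coefficients
    below (softmax over per-element update magnitudes across clients)
    are an illustrative assumption, not the strategy from the paper.
    """
    # Stack per-client updates: shape (num_clients, *weight_shape).
    updates = np.stack([w - global_weights for w in local_weights])
    # Per-element softmax over the client axis: at each coordinate,
    # clients with larger-magnitude updates receive more influence.
    coeffs = np.exp(np.abs(updates))
    coeffs /= coeffs.sum(axis=0, keepdims=True)
    # Coefficients sum to 1 at every element, so this is a convex
    # combination of the client updates, applied coordinate by coordinate.
    return global_weights + (coeffs * updates).sum(axis=0)

# Example: two clients, a 3-parameter "model" starting from zeros.
global_w = np.zeros(3)
clients = [np.array([1.0, 0.0, 2.0]), np.array([0.0, 1.0, 2.0])]
new_global = elementwise_aggregate(clients, global_w)
```

Where the clients agree (index 2 above), the element-wise rule reduces to a plain average; where they disagree, each coordinate is blended independently rather than by a single per-client ratio.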

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2404.15919
Document Type :
Working Paper
Full Text :
https://doi.org/10.1109/ICDMW60847.2023.00031