
FedSim: Similarity guided model aggregation for Federated Learning.

Authors :
Palihawadana, Chamath
Wiratunga, Nirmalie
Wijekoon, Anjana
Kalutarage, Harsha
Source :
Neurocomputing. Apr 2022, Vol. 483, p432-445. 14p.
Publication Year :
2022

Abstract

Federated Learning (FL) is a distributed machine learning approach in which clients contribute to learning a global model in a privacy-preserving manner. Effective aggregation of client models is essential to create a generalised global model. The extent to which a client is generalisable and contributes to this aggregation can be ascertained by analysing inter-client relationships. We use similarity between clients to model such relationships. We explore how similarity knowledge can be inferred by comparing client gradients, rather than client data, which would violate the privacy-preserving constraint of FL. The similarity-guided FedSim algorithm, introduced in this paper, decomposes FL aggregation into local and global steps. Clients with similar gradients are clustered to provide local aggregations, which thereafter can be globally aggregated to ensure better coverage whilst reducing variance. Our comparative study also investigates the applicability of FedSim on both real-world datasets and synthetic datasets where statistical heterogeneity can be controlled and studied systematically. A comparative study of FedSim with state-of-the-art FL baselines, FedAvg and FedProx, clearly shows significant performance gains. Our findings confirm that by exploiting latent inter-client similarities, FedSim's performance is significantly better and more stable compared to both these baselines. [ABSTRACT FROM AUTHOR]
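
The abstract describes a two-step aggregation: clients are clustered by the similarity of their gradient updates, each cluster is aggregated locally, and the cluster-level models are then aggregated globally. The following Python sketch illustrates that idea only in outline; it is not the authors' implementation. The use of cosine similarity, k-means clustering, data-size weighting within clusters, and a uniform mean across clusters are all assumptions made for illustration.

# Hypothetical sketch of similarity-guided aggregation (not the FedSim code).
# Client updates are flattened gradient/parameter-delta vectors; similarity is
# computed from these updates rather than from client data.
import numpy as np
from sklearn.cluster import KMeans

def cosine_similarity_matrix(updates):
    """Pairwise cosine similarity between flattened client updates."""
    norms = np.linalg.norm(updates, axis=1, keepdims=True)
    unit = updates / np.clip(norms, 1e-12, None)
    return unit @ unit.T

def similarity_guided_aggregate(client_updates, client_sizes, n_clusters=3):
    """Local step: average updates within each similarity cluster.
    Global step: average the resulting cluster-level models."""
    updates = np.stack(client_updates)       # shape: (n_clients, n_params)
    sim = cosine_similarity_matrix(updates)  # similarity inferred from gradients, not data
    # Clustering choice (k-means on similarity profiles) is an assumption.
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(sim)

    cluster_models = []
    for c in range(n_clusters):
        idx = np.where(labels == c)[0]
        if len(idx) == 0:
            continue
        w = np.asarray([client_sizes[i] for i in idx], dtype=float)
        w /= w.sum()
        # Local aggregation: data-size weighted average within the cluster.
        cluster_models.append(np.average(updates[idx], axis=0, weights=w))

    # Global aggregation: simple mean over cluster-level models.
    return np.mean(np.stack(cluster_models), axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fake_updates = [rng.normal(size=10) for _ in range(6)]
    fake_sizes = [100, 120, 80, 90, 110, 95]
    print(similarity_guided_aggregate(fake_updates, fake_sizes, n_clusters=2))

The intent of the two-step structure, as described in the abstract, is that averaging within similar clusters reduces variance locally, while the global step over cluster models retains coverage of heterogeneous clients; the paper itself should be consulted for the actual clustering and weighting scheme.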

Details

Language :
English
ISSN :
0925-2312
Volume :
483
Database :
Academic Search Index
Journal :
Neurocomputing
Publication Type :
Academic Journal
Accession number :
155655266
Full Text :
https://doi.org/10.1016/j.neucom.2021.08.141