
GlueFL: reconciling client sampling and model masking for bandwidth efficient federated learning

Authors :
He, Shiqi
Publication Year :
2023
Publisher :
University of British Columbia, 2023.

Abstract

Federated learning (FL) is an effective technique to directly involve edge devices in machine learning (ML) training while preserving client privacy. However, the substantial communication overhead of FL makes training challenging when edge devices have limited network bandwidth. Existing work on optimizing FL bandwidth overlooks downstream transmission and does not account for FL client sampling. We propose GlueFL, a framework that incorporates new client sampling and model compression algorithms to mitigate the low download bandwidth of FL clients. GlueFL prioritizes recently used clients and bounds the number of changed positions in compression masks in each round. We analyse FL convergence under GlueFL's sticky sampling and show that our proposed weighted aggregation preserves unbiasedness of updates and convergence. We evaluate GlueFL empirically, and demonstrate downstream bandwidth and training time savings on three public datasets. On average, our evaluation shows that GlueFL reduces training time by 29% and downstream bandwidth overhead by 27% compared to three state-of-the-art strategies.
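The two mechanisms the abstract names can be sketched in a few lines. This is a minimal illustration, not the thesis's implementation: the function names (`sticky_sample`, `shift_mask`) and parameters (`round_size`, `new_per_round`, `max_shift`) are assumptions made for the example. It shows (1) sampling a round's participants mostly from a "sticky" group of recently used clients, and (2) updating a top-k compression mask while bounding how many mask positions may change per round.

```python
import random


def sticky_sample(pool, sticky_group, round_size, new_per_round, rng=random):
    """One round of sticky sampling (illustrative sketch, names assumed).

    Picks `round_size - new_per_round` clients from the sticky group and
    `new_per_round` fresh clients from the rest of the pool, then refreshes
    the sticky group so the fresh clients replace an equal number of
    existing members.
    """
    returning = rng.sample(sticky_group, round_size - new_per_round)
    outside = [c for c in pool if c not in sticky_group]
    fresh = rng.sample(outside, new_per_round)
    # Refresh: evict as many old members as fresh clients joined,
    # keeping the sticky group at a fixed size.
    evicted = set(rng.sample(sticky_group, new_per_round))
    new_sticky = [c for c in sticky_group if c not in evicted] + fresh
    return returning + fresh, new_sticky


def shift_mask(old_mask, scores, mask_size, max_shift):
    """Bounded mask update sketch: prefer the top-`mask_size` coordinates
    by score, but admit at most `max_shift` coordinates that were not in
    `old_mask`, filling the rest with the highest-scoring old coordinates.
    """
    ranked = sorted(range(len(scores)), key=lambda i: -scores[i])
    wanted = ranked[:mask_size]
    entering = [i for i in wanted if i not in old_mask][:max_shift]
    # Retain the best old coordinates to fill the remaining slots.
    kept = [i for i in ranked if i in old_mask][: mask_size - len(entering)]
    return set(kept) | set(entering)
```

Bounding the number of entering coordinates is what limits downstream traffic: a client that already holds the previous round's masked model only needs the values at positions that changed, so capping `max_shift` caps the per-round download.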

Details

Language :
English
Database :
OpenAIRE
Accession number :
edsair.doi...........516d26182a39cfbd6d00558d7ee7e7b4
Full Text :
https://doi.org/10.14288/1.0423118