Federated Learning Communication-Efficiency Framework via Corset Construction.

Authors :
Li, Kaiju
Wang, Hao
Source :
Computer Journal. Sep2023, Vol. 66 Issue 9, p2077-2101. 25p.
Publication Year :
2023

Abstract

Federated learning (FL) learns a shared global model across multiple client devices without breaching privacy requirements. An essential challenge, however, is that devices in FL usually have limited network bandwidth, making inefficient communication an important bottleneck for FL deployment. Current studies try to overcome this shortcoming by compressing the number of model-update bits uploaded by every client, but they do not explore the underlying reason why redundant parameters are generated. In this paper, we propose the Corset-Based Federated Learning framework (CBFL), a novel FL communication framework built from the perspective of data redundancy. Instead of training a regular network model on the full dataset, CBFL trains a much smaller evolved network model on an extracted corset, which intrinsically reduces the overall transmitted bits and yields efficient computation while maintaining desirable model accuracy. CBFL comprises a novel Fedcorset construction algorithm run at selected clients and a distributed model evolution scheme that fits the model to the constructed corset. The training model size is dynamically adapted to the corset, either removing a fraction of unimportant connections or adding important ones at each communication round. Experimental results show that CBFL transfers only about 13% of the communication bits and saves around 56% of computing time while incurring only a 2% degradation in model accuracy. [ABSTRACT FROM AUTHOR]
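The abstract names two mechanisms: constructing a small, reweighted corset (coreset) at selected clients, and a model evolution step that prunes unimportant connections and grows important ones each round. The paper's actual algorithms are not given in the abstract, so the following is only a minimal sketch of those two ideas under common assumptions: the corset is built by keeping the highest-importance samples and reweighting them, and evolution is magnitude-based pruning with random regrowth (a placeholder for the paper's importance rule). All function names here are hypothetical.

```python
import random

def build_coreset(points, k, importance):
    """Sketch of a coreset: keep the k highest-importance samples and
    reweight them so their total importance matches the full dataset.
    (Hypothetical stand-in for the paper's Fedcorset construction.)"""
    ranked = sorted(points, key=importance, reverse=True)
    kept = ranked[:k]
    total = sum(importance(p) for p in points)
    kept_mass = sum(importance(p) for p in kept)
    weight = total / kept_mass if kept_mass else 1.0
    return [(p, weight) for p in kept]  # (sample, reweighting factor)

def evolve_mask(weights, mask, prune_frac):
    """One 'model evolution' round: prune the smallest-magnitude active
    connections, then regrow the same number of inactive ones (chosen
    randomly here; the paper would use an importance criterion)."""
    active = [i for i, m in enumerate(mask) if m]
    inactive = [i for i, m in enumerate(mask) if not m]
    n = int(len(active) * prune_frac)
    for i in sorted(active, key=lambda i: abs(weights[i]))[:n]:
        mask[i] = 0  # drop weakest connections
    for i in random.sample(inactive, min(n, len(inactive))):
        mask[i] = 1  # regrow the same number elsewhere
    return mask
```

Because pruning and regrowth are balanced, the number of active connections (and hence the per-round upload size) stays fixed at the reduced level, which is one plausible reading of how a smaller evolved model cuts transmission bits.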

Subjects

Subjects :
*GLOBAL method of teaching

Details

Language :
English
ISSN :
00104620
Volume :
66
Issue :
9
Database :
Academic Search Index
Journal :
Computer Journal
Publication Type :
Academic Journal
Accession number :
172001774
Full Text :
https://doi.org/10.1093/comjnl/bxac062