
CBFL: A Communication-Efficient Federated Learning Framework From Data Redundancy Perspective

Authors:
Kaiju Li
Chunhua Xiao
Source:
IEEE Systems Journal. 16:5572-5583
Publication Year:
2022
Publisher:
Institute of Electrical and Electronics Engineers (IEEE), 2022.

Abstract

Federated learning (FL) is an emerging machine learning framework that enables multiple mobile users to collaboratively train a global model without uploading their local sensitive data. Due to limited network bandwidth, communication efficiency has become a significant bottleneck for the deployment of FL. Existing works attempt to improve this situation by compressing each client update to reduce the total bits transferred. However, these approaches operate only at the level of the update parameters, reducing the transmission of redundant parameters without exploring the intrinsic reasons why those parameters are redundant. In this article, we propose a coreset-based FL (CBFL) framework. Instead of training a regular network model on the full datasets, CBFL trains a much smaller, well-matched evolutionary network model on a coreset. CBFL thereby indirectly reduces the total transmission bits for each client while achieving accuracy similar to training with the full datasets. CBFL comprises a novel distributed coreset construction algorithm and an adaptive model evolution algorithm; during training, the network model is adaptively adjusted by dynamically removing the least important connections from the current model. Experimental results with various datasets and models show that CBFL finds an optimized evolutionary model with about 10% of the total connections of the original regular model at the cost of only about 2% degradation in model accuracy.
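The abstract does not spell out the two algorithms, so the following is only a minimal sketch of the general idea under common assumptions: coreset points are chosen by importance sampling with feature-norm sensitivity as a stand-in proxy, and connection importance is taken to be weight magnitude. All function names (`build_coreset`, `evolve_model`) and parameter choices here are hypothetical illustrations, not the paper's actual method.

```python
import numpy as np

def build_coreset(features, labels, coreset_size, rng=None):
    """Hypothetical coreset construction via importance sampling.

    Points are sampled with probability proportional to a sensitivity
    proxy (feature norm here, as an assumption) and reweighted by the
    inverse sampling probability so that weighted losses on the coreset
    approximate losses on the full dataset.
    """
    rng = np.random.default_rng(rng)
    sensitivity = np.linalg.norm(features, axis=1) + 1e-12
    probs = sensitivity / sensitivity.sum()
    idx = rng.choice(len(features), size=coreset_size, replace=False, p=probs)
    weights = 1.0 / (probs[idx] * coreset_size)  # inverse-probability weights
    return features[idx], labels[idx], weights

def evolve_model(weight_matrices, keep_fraction=0.10):
    """Hypothetical model evolution step: drop least important connections.

    Assumes magnitude-based importance: a global threshold is chosen so
    that roughly `keep_fraction` of all connections survive (about 10%,
    matching the abstract's reported figure), and weights below it are
    zeroed out. Returns the pruned weights and the binary masks.
    """
    all_mags = np.concatenate([np.abs(w).ravel() for w in weight_matrices])
    threshold = np.quantile(all_mags, 1.0 - keep_fraction)
    masks = [np.abs(w) >= threshold for w in weight_matrices]
    pruned = [w * m for w, m in zip(weight_matrices, masks)]
    return pruned, masks
```

In this reading, each client would train locally on its coreset, periodically apply the evolution step, and transmit only the surviving connections, which is how the per-round update shrinks without directly compressing the parameters.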

Details

ISSN:
2373-7816 and 1932-8184
Volume:
16
Database:
OpenAIRE
Journal:
IEEE Systems Journal
Accession number:
edsair.doi...........988ce2b8769ba1002dafe12fd3168c1c
Full Text:
https://doi.org/10.1109/jsyst.2021.3119152