
Exploring the Practicality of Federated Learning: A Survey Towards the Communication Perspective

Authors:
Le, Khiem
Luong-Ha, Nhan
Nguyen-Duc, Manh
Le-Phuoc, Danh
Do, Cuong
Wong, Kok-Seng
Publication Year:
2024

Abstract

Federated Learning (FL) is a promising paradigm that offers significant advancements in privacy-preserving, decentralized machine learning by enabling collaborative training of models across distributed devices without centralizing data. However, the practical deployment of FL systems faces a significant bottleneck: the communication overhead caused by frequently exchanging large model updates between numerous devices and a central server. This communication inefficiency can hinder training speed, model performance, and the overall feasibility of real-world FL applications. In this survey, we investigate various strategies and advancements made in communication-efficient FL, highlighting their impact and potential to overcome the communication challenges inherent in FL systems. Specifically, we define measures for communication efficiency, analyze sources of communication inefficiency in FL systems, and provide a taxonomy and comprehensive review of state-of-the-art communication-efficient FL methods. Additionally, we discuss promising future research directions for enhancing the communication efficiency of FL systems. Addressing the communication bottleneck allows FL to be deployed scalably and practically across diverse applications that require privacy-preserving, decentralized machine learning, such as IoT, healthcare, and finance.
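To make the communication overhead described in the abstract concrete, the sketch below (not taken from the survey) estimates per-round upload traffic for a hypothetical FL setup and shows how top-k gradient sparsification, one common communication-efficient FL technique, reduces it. All numbers (model size, client count, sparsity ratio) are illustrative assumptions chosen for the example.

```python
import numpy as np

# Illustrative assumptions (not from the survey):
# a 50M-parameter model, 100 participating clients per round, float32 updates.
NUM_PARAMS = 50_000_000
CLIENTS_PER_ROUND = 100
BYTES_PER_PARAM = 4  # float32

def dense_upload_bytes():
    """Total bytes uploaded to the server in one round with uncompressed updates."""
    return NUM_PARAMS * BYTES_PER_PARAM * CLIENTS_PER_ROUND

def topk_sparsify(update, k_ratio=0.01):
    """Keep only the k largest-magnitude entries of a local update (top-k sparsification)."""
    k = max(1, int(k_ratio * update.size))
    idx = np.argpartition(np.abs(update), -k)[-k:]
    return idx, update[idx]  # indices + values are what each client transmits

def sparse_upload_bytes(k_ratio=0.01):
    """Per-round upload when each client sends only top-k indices and values."""
    k = int(k_ratio * NUM_PARAMS)
    per_client = k * (4 + 4)  # 4 bytes per int32 index + 4 bytes per float32 value
    return per_client * CLIENTS_PER_ROUND

if __name__ == "__main__":
    dense = dense_upload_bytes()
    sparse = sparse_upload_bytes(k_ratio=0.01)
    print(f"Dense upload per round : {dense / 1e9:.1f} GB")
    print(f"Top-1% sparsified      : {sparse / 1e9:.2f} GB "
          f"({dense / sparse:.0f}x reduction)")

    # Small demo of the sparsifier on a toy update vector.
    rng = np.random.default_rng(0)
    update = rng.normal(size=1000).astype(np.float32)
    idx, vals = topk_sparsify(update, k_ratio=0.05)
    print(f"Toy update: kept {idx.size} of {update.size} entries")
```

Under these assumptions, each round moves roughly 20 GB of dense updates to the server, while 1% sparsification cuts the upload by about two orders of magnitude; real savings depend on model size, encoding of the indices, and any accompanying error-feedback scheme.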

Details

Database:
arXiv
Publication Type:
Report
Accession Number:
edsarx.2405.20431
Document Type:
Working Paper