
Heroes: Lightweight Federated Learning with Neural Composition and Adaptive Local Update in Heterogeneous Edge Networks

Authors:
Yan, Jiaming
Liu, Jianchun
Wang, Shilong
Xu, Hongli
Liu, Haifeng
Zhou, Jianhua
Publication Year:
2023

Abstract

Federated Learning (FL) enables distributed clients to collaboratively train models without exposing their private data. However, limited client resources make efficient FL difficult to implement. Most existing works compress the transmitted gradients or prune the global model to reduce resource costs, but leave the compressed or pruned parameters under-optimized, which degrades training performance. To address this issue, the neural composition technique constructs size-adjustable models by composing low-rank tensors, allowing every parameter in the global model to learn knowledge from all clients. Nevertheless, some tensors can only be optimized by a small fraction of clients, so the global model may receive insufficient training, leading to a long completion time, especially in heterogeneous edge scenarios. To this end, we enhance the neural composition technique, enabling all parameters to be fully trained. Further, we propose a lightweight FL framework, called Heroes, with enhanced neural composition and adaptive local update. A greedy-based algorithm is designed to adaptively assign proper tensors and local update frequencies to participating clients according to their heterogeneous capabilities and resource budgets. Extensive experiments demonstrate that Heroes reduces traffic consumption by about 72.05% and provides up to 2.97× speedup over the baselines.

Comment: 11 pages, 9 figures, to be published in INFOCOM 2024
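The core idea behind neural composition can be illustrated with a minimal sketch. The code below is a hypothetical toy illustration, not the authors' implementation: a layer's weight matrix is composed from a shared pool of low-rank factor pairs, so a resource-constrained client composes fewer tensors (a smaller rank) while every factor still belongs to, and is eventually optimized within, the one global model. All names (`compose_weight`, `U`, `V`, `rank`) are illustrative assumptions.

```python
import numpy as np

def compose_weight(U, V, rank):
    """Compose a (d_out x d_in) weight from the first `rank` rank-1 factor pairs.

    U: (d_out, R) and V: (d_in, R) hold R shared low-rank factors;
    a client's budget determines how many of them it composes.
    """
    return U[:, :rank] @ V[:, :rank].T

rng = np.random.default_rng(0)
d_out, d_in, R = 8, 6, 4
U = rng.standard_normal((d_out, R))  # shared global factors
V = rng.standard_normal((d_in, R))

W_small = compose_weight(U, V, rank=2)  # weaker client: composes 2 tensors
W_full = compose_weight(U, V, rank=4)   # stronger client: composes all 4
```

Both composed weights have the same shape as the full layer, which is what lets heterogeneous clients train (sub)models of different capacity against a single set of global parameters; the paper's concern is that the trailing factors (here, columns 3 and 4) are only ever trained by the stronger clients, motivating the enhanced composition and adaptive update frequencies in Heroes.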

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2312.01617
Document Type:
Working Paper