
FedBAT: Communication-Efficient Federated Learning via Learnable Binarization

Authors :
Li, Shiwei
Xu, Wenchao
Wang, Haozhao
Tang, Xing
Qi, Yining
Xu, Shijie
Luo, Weihong
Li, Yuhua
He, Xiuqiang
Li, Ruixuan
Publication Year :
2024

Abstract

Federated learning is a promising distributed machine learning paradigm that can effectively exploit large-scale data without exposing users' privacy. However, it may incur significant communication overhead, thereby potentially impairing the training efficiency. To address this challenge, numerous studies suggest binarizing the model updates. Nonetheless, traditional methods usually binarize model updates in a post-training manner, resulting in significant approximation errors and consequent degradation in model accuracy. To this end, we propose Federated Binarization-Aware Training (FedBAT), a novel framework that directly learns binary model updates during the local training process, thus inherently reducing the approximation errors. FedBAT incorporates an innovative binarization operator, along with meticulously designed derivatives to facilitate efficient learning. In addition, we establish theoretical guarantees regarding the convergence of FedBAT. Extensive experiments are conducted on four popular datasets. The results show that FedBAT significantly accelerates the convergence and exceeds the accuracy of baselines by up to 9%, even surpassing that of FedAvg in some cases.

Comment: Accepted by ICML 2024
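To illustrate the contrast the abstract draws, the sketch below shows a common *post-training* binarization of a model update (scaled sign, with the scale chosen to minimize L2 approximation error) alongside a straight-through-estimator-style gradient, a standard device for training through a non-differentiable sign. This is a hypothetical NumPy sketch of the general technique; FedBAT's actual binarization operator and its derivatives are the paper's contribution and differ from this.

```python
import numpy as np

def binarize_update(update):
    """Post-training-style binarization: alpha * sign(update).

    Choosing alpha = mean(|update|) minimizes the L2 error
    ||update - alpha * sign(update)||^2, but the residual error
    can still be large -- the motivation for binarization-aware
    training. (Illustrative; not FedBAT's operator.)
    """
    alpha = np.mean(np.abs(update))
    return alpha * np.sign(update)

def ste_grad(grad_output, update, clip=1.0):
    """Straight-through estimator: treat sign() as identity in the
    backward pass, zeroing gradients where |update| exceeds `clip`.
    (A generic surrogate derivative, not FedBAT's designed one.)"""
    return grad_output * (np.abs(update) <= clip)
```

For example, binarizing a 4-element update keeps only its signs and a single shared magnitude, so each coordinate costs 1 bit (plus one float for the scale) instead of 32.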

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2408.03215
Document Type :
Working Paper