
Preconditioned Federated Learning

Authors :
Tao, Zeyi
Wu, Jindi
Li, Qun
Publication Year :
2023

Abstract

Federated Learning (FL) is a distributed machine learning approach that enables model training in a communication-efficient and privacy-preserving manner. The standard optimization method in FL is Federated Averaging (FedAvg), which performs multiple local SGD steps between communication rounds. FedAvg has been considered to lack algorithmic adaptivity compared to modern first-order adaptive optimizers. In this paper, we propose new communication-efficient FL algorithms based on two adaptive frameworks: local adaptivity (PreFed) and server-side adaptivity (PreFedOp). The proposed methods achieve adaptivity by using a novel covariance matrix preconditioner. Theoretically, we provide convergence guarantees for our algorithms. Empirical experiments show that our methods achieve state-of-the-art performance in both i.i.d. and non-i.i.d. settings.

Comment: preprint
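As context for the abstract's description of FedAvg (multiple local SGD steps per client between communication rounds, followed by server-side averaging), the round structure can be sketched as below. This is a generic FedAvg sketch on a toy quadratic objective, not the paper's PreFed/PreFedOp method; all function names and the toy clients are illustrative assumptions.

```python
import numpy as np

def local_sgd(w, grad_fn, steps, lr):
    """Run several local SGD steps starting from the global weights w."""
    w = w.copy()
    for _ in range(steps):
        w -= lr * grad_fn(w)
    return w

def fedavg_round(w_global, client_grad_fns, steps=5, lr=0.1):
    """One FedAvg communication round: each client performs local SGD,
    then the server averages the resulting local models."""
    local_models = [local_sgd(w_global, g, steps, lr) for g in client_grad_fns]
    return np.mean(local_models, axis=0)

# Toy setup: client i minimizes ||w - c_i||^2 / 2, so its gradient is w - c_i.
# The averaged fixed point of FedAvg here is the mean of the client optima c_i.
centers = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
client_grads = [lambda w, c=c: w - c for c in centers]

w = np.zeros(2)
for _ in range(50):
    w = fedavg_round(w, client_grads)
# w converges toward the mean of the client optima, [0.5, 0.5]
```

The paper's contribution replaces the plain SGD update above with preconditioned updates (locally in PreFed, or on the server side in PreFedOp) using a covariance-matrix preconditioner.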

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2309.11378
Document Type :
Working Paper