1. Communication-Efficient Randomized Algorithm for Multi-Kernel Online Federated Learning.
- Author
- Hong, Songnam and Chae, Jeongmin
- Subjects
- *KERNEL functions, *ONLINE algorithms
- Abstract
Online federated learning (OFL) is a promising framework for learning a sequence of global functions from distributed sequential data at local devices. In this framework, we first introduce a single-kernel OFL method (termed S-KOFL) by incorporating random-feature (RF) approximation, online gradient descent (OGD), and federated averaging (FedAvg). As in the centralized counterpart, an extension to multiple kernels is necessary. Harnessing the extension principle of the centralized method, we construct a vanilla multi-kernel algorithm (termed vM-KOFL) and prove its asymptotic optimality. However, it is not practical, as its communication overhead grows linearly with the size of the kernel dictionary. Moreover, this problem cannot be addressed by the existing communication-efficient techniques (e.g., quantization and sparsification) of conventional federated learning. Our major contribution is a novel randomized algorithm (named eM-KOFL) that exhibits performance similar to vM-KOFL while keeping the communication cost low. We theoretically prove that eM-KOFL achieves an optimal sublinear regret bound. Mimicking the key concept of eM-KOFL in an efficient way, we propose a more practical pM-KOFL with the same communication overhead as S-KOFL. Via numerical tests on real datasets, we demonstrate that pM-KOFL yields almost the same performance as vM-KOFL (or eM-KOFL) on various online learning tasks.
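To make the single-kernel setting concrete, the following is a minimal sketch of how random-feature approximation, online gradient descent, and FedAvg can fit together in one round. All names, dimensions, the RBF kernel, and the squared loss are illustrative assumptions, not details taken from the paper.

```python
# Sketch: single-kernel online federated learning with RF approximation,
# per-client OGD, and FedAvg aggregation (illustrative assumptions only).
import numpy as np

rng = np.random.default_rng(0)
d, D, num_clients, rounds = 5, 100, 4, 50   # input dim, random features, clients, rounds
W = rng.normal(size=(D, d))                 # random-feature frequencies (RBF kernel, unit bandwidth)
b = rng.uniform(0.0, 2 * np.pi, size=D)     # random phases
eta = 0.05                                  # OGD step size

def rf_map(x):
    """Random-feature map z(x) approximating an RBF kernel."""
    return np.sqrt(2.0 / D) * np.cos(W @ x + b)

theta_global = np.zeros(D)                  # global model in random-feature space

for t in range(rounds):
    local_models = []
    for k in range(num_clients):
        theta = theta_global.copy()
        # Assume each client observes one streaming sample per round.
        x = rng.normal(size=d)
        y = np.sin(x.sum())                 # synthetic target for illustration
        z = rf_map(x)
        grad = 2.0 * (theta @ z - y) * z    # gradient of the squared loss
        theta -= eta * grad                 # one OGD step on the local loss
        local_models.append(theta)
    # FedAvg: the server averages the clients' updated models.
    theta_global = np.mean(local_models, axis=0)
```

In a multi-kernel extension, each client would maintain one such model per kernel in the dictionary, which is what makes the vanilla scheme's communication cost grow with the dictionary size.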
- Published
- 2022