
Federated Learning for Inference at Anytime and Anywhere

Authors:
Liu, Zicheng
Li, Da
Fernandez-Marques, Javier
Laskaridis, Stefanos
Gao, Yan
Dudziak, Łukasz
Li, Stan Z.
Hu, Shell Xu
Hospedales, Timothy
Publication Year:
2022

Abstract

Federated learning (FL) has been predominantly concerned with collaborative training of deep networks from scratch, and especially with the many challenges that arise, such as communication cost, robustness to heterogeneous data, and support for diverse device capabilities. However, there is no unified framework that addresses all these problems together. This paper studies the challenges and opportunities of exploiting pre-trained Transformer models in FL. In particular, we propose to efficiently adapt such pre-trained models by injecting a novel attention-based adapter module at each transformer block that both modulates the forward pass and makes an early prediction. Training only the lightweight adapter by FL leads to fast and communication-efficient learning even in the presence of heterogeneous data and devices. Extensive experiments on standard FL benchmarks, including CIFAR-100, FEMNIST and SpeechCommandsv2, demonstrate that this simple framework provides fast and accurate FL while supporting heterogeneous device capabilities, efficient personalization, and scalable-cost anytime inference.

14 pages, 3 figures
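To make the adapter idea concrete, below is a minimal PyTorch sketch of a per-block module that both modulates the forward pass and emits an early prediction. The bottleneck attention design, the head count, and the use of the first (CLS) token for the early exit are illustrative assumptions, not the paper's exact architecture.

```python
import torch
import torch.nn as nn

class AdapterWithExit(nn.Module):
    """Lightweight attention-based adapter plus early-exit head,
    meant to sit after one frozen pre-trained transformer block.
    A sketch of the abstract's idea; dimensions and the CLS-token
    exit are assumptions, not the authors' exact module."""

    def __init__(self, dim: int, num_classes: int, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(dim, bottleneck)  # down-project to a small width
        self.attn = nn.MultiheadAttention(bottleneck, num_heads=4,
                                          batch_first=True)
        self.up = nn.Linear(bottleneck, dim)    # project back to block width
        self.exit_head = nn.Linear(dim, num_classes)  # early prediction

    def forward(self, h: torch.Tensor):
        # h: (batch, tokens, dim), the output of one frozen transformer block
        z = self.down(h)
        z, _ = self.attn(z, z, z)         # attention-based modulation
        h = h + self.up(z)                # residual update of the forward pass
        logits = self.exit_head(h[:, 0])  # early exit from the CLS token
        return h, logits
```

In an FL round under this sketch, the pre-trained backbone stays frozen and only the adapter parameters are trained locally and exchanged, which keeps communication small; the per-block exit heads give anytime inference by stopping at any depth. A hypothetical usage:

```python
adapter = AdapterWithExit(dim=768, num_classes=100)  # e.g. ViT-Base, CIFAR-100
h = torch.randn(8, 197, 768)          # placeholder token features from a block
h, early_logits = adapter(h)          # modulated features + early prediction
```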

Details

Language:
English
Database:
OpenAIRE
Accession number:
edsair.doi.dedup.....335916a15e04149d23e763dc718475b7