
FedZKT: Zero-Shot Knowledge Transfer towards Resource-Constrained Federated Learning with Heterogeneous On-Device Models

Authors :
Zhang, Lan
Wu, Dapeng
Yuan, Xiaoyong
Publication Year :
2021

Abstract

Federated learning enables multiple distributed devices to collaboratively learn a shared prediction model without centralizing their on-device data. Most current algorithms require comparable individual effort for local training, with on-device models of the same structure and size, which impedes participation by resource-constrained devices. Given the widespread yet heterogeneous devices in use today, this paper proposes FedZKT, a federated learning framework that supports heterogeneous on-device models through zero-shot knowledge transfer. Specifically, FedZKT allows each device to independently choose its on-device model based on its local resources. To transfer knowledge across these heterogeneous on-device models, FedZKT uses a zero-shot distillation approach that, unlike prior work relying on a public dataset or a pre-trained data generator, has no prerequisites regarding private on-device data. Moreover, this compute-intensive distillation task is assigned to the server, so that resource-constrained devices can still participate: the server adversarially trains a generator against the ensemble of collected on-device models. The distilled central knowledge is then sent back in the form of the corresponding on-device model parameters, which each device can easily absorb. Extensive experimental studies demonstrate the effectiveness and robustness of FedZKT under agnostic on-device knowledge, heterogeneous on-device models, and other challenging federated learning scenarios such as heterogeneous on-device data and straggler effects.

Comment: This paper has been accepted to ICDCS 2022
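To make the server-side procedure concrete, below is a minimal sketch of the kind of adversarial zero-shot distillation loop the abstract describes: a generator is trained to produce synthetic inputs on which a student model disagrees with the ensemble of collected on-device (teacher) models, and the student is then trained to match the ensemble on freshly generated inputs. This is an illustrative reconstruction, not the authors' implementation; the class names, input shape, hyperparameters, and the average-logit ensembling are all assumptions, and a shared label space across the heterogeneous models is assumed.

```python
# Illustrative sketch of server-side zero-shot distillation in the spirit of
# FedZKT. All names, dimensions, and hyperparameters are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Generator(nn.Module):
    """Maps noise vectors to synthetic inputs (assumed 28x28 grayscale)."""
    def __init__(self, noise_dim=100):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(noise_dim, 256), nn.ReLU(),
            nn.Linear(256, 28 * 28), nn.Tanh(),
        )

    def forward(self, z):
        return self.net(z).view(-1, 1, 28, 28)

def kl_disagreement(student_logits, teacher_logits, T=1.0):
    # KL(teacher || student) on temperature-softened distributions.
    return F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    )

def distill_round(generator, student, teachers, steps=50, batch=64, noise_dim=100):
    """One server-side round of the assumed max-min game: the generator
    seeks inputs where the student disagrees with the (average-logit)
    teacher ensemble; the student then minimizes that disagreement."""
    g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
    s_opt = torch.optim.Adam(student.parameters(), lr=1e-3)
    for t in teachers:
        t.eval()  # collected on-device models are frozen teachers here
    for _ in range(steps):
        # Generator step: maximize disagreement (ascent via negated loss).
        z = torch.randn(batch, noise_dim)
        x = generator(z)
        with torch.no_grad():
            t_logits = torch.stack([t(x) for t in teachers]).mean(0)
        g_loss = -kl_disagreement(student(x), t_logits)
        g_opt.zero_grad(); g_loss.backward(); g_opt.step()
        # Student step: minimize disagreement on fresh synthetic inputs.
        z = torch.randn(batch, noise_dim)
        x = generator(z).detach()  # no gradient back into the generator
        with torch.no_grad():
            t_logits = torch.stack([t(x) for t in teachers]).mean(0)
        s_loss = kl_disagreement(student(x), t_logits)
        s_opt.zero_grad(); s_loss.backward(); s_opt.step()
```

Under this reading, the max-min game replaces the private data the server never sees: the generator hunts for inputs that expose gaps between the student and the ensemble, so no public dataset or pre-trained generator is needed, consistent with the zero-shot claim in the abstract.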

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2109.03775
Document Type :
Working Paper