
Two‐stage personalized federated learning based on sparse pretraining

Authors :
Tong Liu
Kaixuan Xie
Yi Kong
Guojun Chen
Yinfei Xu
Lun Xin
Fei Yu
Source :
Electronics Letters, Vol 59, Iss 17, Pp n/a-n/a (2023)
Publication Year :
2023
Publisher :
Wiley, 2023.

Abstract

Personalized federated learning (PFL) was proposed to address the performance degradation of federated learning (FL) under heterogeneous data distributions by producing a dedicated model for each client. However, existing PFL solutions focus only on the performance of the personalized models and ignore the performance of the global model, which reduces the willingness of new clients to participate. To address this problem, this paper proposes a new PFL solution, a two‐stage PFL based on sparse pretraining, which not only trains a sparse personalized model for each client but also yields a sparse global model. The training process is divided into a sparse pretraining stage and a sparse personalized training stage, which target the performance of the global model and of the personalized models, respectively. In addition, a mask sparse aggregation technique is proposed to preserve the sparsity of the global model during the sparse personalized training stage. Experimental results show that, compared with existing algorithms, the proposed algorithm improves the accuracy of the global model while maintaining state‐of‐the‐art personalized model accuracy and achieves higher communication efficiency.
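The abstract does not specify how mask sparse aggregation works; the following is a minimal NumPy sketch of one plausible reading, in which each client trains under a binary pruning mask and the server averages each weight only over the clients that retained it, so positions kept by no client stay zero and the aggregated global model remains sparse. The function name, mask semantics, and averaging rule are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

def mask_sparse_aggregate(client_weights, client_masks):
    """Aggregate sparse client models while preserving global sparsity.

    client_weights: list of arrays, one (already masked) weight tensor per client.
    client_masks:   list of binary arrays of the same shape (1 = weight kept).
    Returns the aggregated global weight tensor, zero wherever no client
    retained the corresponding weight.
    """
    stacked_w = np.stack(client_weights)                 # (n_clients, ...)
    stacked_m = np.stack(client_masks).astype(float)     # (n_clients, ...)
    counts = stacked_m.sum(axis=0)                       # clients keeping each weight
    summed = (stacked_w * stacked_m).sum(axis=0)
    # Average per retained position; leave zero where counts == 0 (sparsity kept).
    return np.divide(summed, counts,
                     out=np.zeros_like(summed), where=counts > 0)

# Example: two clients with overlapping but different pruning masks.
w1 = np.array([0.2, 0.0, 0.6]); m1 = np.array([1, 0, 1])
w2 = np.array([0.4, 0.0, 0.0]); m2 = np.array([1, 0, 0])
print(mask_sparse_aggregate([w1, w2], [m1, m2]))  # -> [0.3, 0.0, 0.6]
```

Averaging only over contributing clients (rather than over all clients) avoids diluting weights that few clients retain; whether the paper uses this rule or a plain masked mean is not stated in the abstract.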

Details

Language :
English
ISSN :
1350-911X and 0013-5194
Volume :
59
Issue :
17
Database :
Directory of Open Access Journals
Journal :
Electronics Letters
Publication Type :
Academic Journal
Accession number :
edsdoj.65b43f61d2134ca08f8a3b4d589e6d2b
Document Type :
article
Full Text :
https://doi.org/10.1049/ell2.12943