Hierarchical federated learning with global differential privacy.
- Source: Electronic Research Archive, 2023, Vol. 31, Issue 7, p1-18. 18p.
- Publication Year: 2023
Abstract
- Federated learning (FL) is a distributed machine learning framework for obtaining an optimal model from clients' local updates. As an efficient design for model convergence and data communication, cloud-edge-client hierarchical federated learning (HFL) attracts more attention than the typical cloud-client architecture. However, HFL still exposes clients' sensitive data, which can be inferred by analyzing the uploaded and downloaded parameters. In this paper, to address this information leakage effectively, we propose a novel privacy-preserving scheme based on differential privacy (DP): Gaussian noise is added to the shared parameters when they are uploaded to edge and cloud servers and when they are broadcast to clients. Our algorithm achieves global differential privacy with adjustable noise across the architecture. We evaluate the performance on image classification tasks. In our experiment on the Modified National Institute of Standards and Technology (MNIST) dataset, we obtain 91% model accuracy; compared with the previous layered HFL-DP design, ours is more secure while being as accurate. [ABSTRACT FROM AUTHOR]
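- The mechanism described in the abstract is, at its core, the Gaussian mechanism applied to the parameters shared between clients, edge servers and the cloud. The sketch below is not the authors' implementation; it is a minimal illustration of that idea, assuming a flattened parameter update and illustrative values for the clipping bound and noise multiplier (the paper's actual noise calibration to its privacy budget is not reproduced here).

      # Minimal sketch (not the authors' code) of adding Gaussian noise to a
      # parameter update before it is shared with an edge or cloud server.
      # clip_norm and noise_multiplier are illustrative, not the paper's values.
      import numpy as np

      def privatize_update(update, clip_norm=1.0, noise_multiplier=1.1, rng=None):
          """Clip an update to L2 norm clip_norm, then add Gaussian noise
          with standard deviation clip_norm * noise_multiplier."""
          rng = np.random.default_rng() if rng is None else rng
          norm = np.linalg.norm(update)
          clipped = update * min(1.0, clip_norm / (norm + 1e-12))
          noise = rng.normal(0.0, clip_norm * noise_multiplier, size=update.shape)
          return clipped + noise

      # Example: privatize a flattened local model update before uploading it.
      local_update = np.random.randn(100) * 0.05
      shared_update = privatize_update(local_update)

  The same noising step could, under the scheme's description, be applied again at the edge and cloud layers with separately tuned noise levels, which is what makes the noise "adjustable" across the hierarchy.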
Details
- Language: English
- ISSN: 2688-1594
- Volume: 31
- Issue: 7
- Database: Academic Search Index
- Journal: Electronic Research Archive
- Publication Type: Academic Journal
- Accession number: 178362303
- Full Text: https://doi.org/10.3934/era.2023190