GFL-ALDPA: a gradient compression federated learning framework based on adaptive local differential privacy budget allocation.
- Source :
- Multimedia Tools & Applications; Mar 2024, Vol. 83, Issue 9, p26349-26368, 20p
- Publication Year :
- 2024
Abstract
- Federated learning (FL) is a popular distributed machine learning framework that can protect users' private data from exposure to adversaries. However, related work shows that sensitive private information can still be compromised by analyzing the parameters uploaded by clients. Applying differential privacy to federated learning has become a popular way to achieve strict privacy guarantees in recent years. To reduce the impact of noise, this paper applies local differential privacy (LDP) to federated learning and proposes a gradient compression federated learning framework based on adaptive local differential privacy budget allocation (GFL-ALDPA). We propose a novel adaptive privacy budget allocation scheme based on communication rounds, which reduces the loss of privacy budget and the amount of noise added to the model: by assigning different privacy budgets to different communication rounds during training, it makes the most of the limited privacy budget and improves model accuracy. Furthermore, we propose a gradient compression mechanism based on dimension reduction, which simultaneously reduces the communication cost, the overall noise, and the loss of the total privacy budget, ensuring accuracy under a specific privacy-preserving guarantee. Finally, this paper presents an experimental evaluation on the MNIST dataset. Theoretical analysis and experiments demonstrate that our framework achieves a better trade-off between privacy preservation, communication efficiency, and model accuracy. [ABSTRACT FROM AUTHOR]
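The abstract's two core ideas, per-round privacy budget allocation and dimension-reducing gradient compression before LDP perturbation, can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: the geometric decay schedule, the top-k compression, and the Laplace mechanism with L1 clipping are all assumptions standing in for the (unstated) details of GFL-ALDPA.

```python
import numpy as np

def allocate_budgets(total_eps, rounds, decay=0.9):
    """Split a total privacy budget across communication rounds.

    Illustrative geometric schedule: earlier rounds receive larger
    budgets (less noise) while the per-round budgets sum to total_eps,
    so sequential composition stays within the overall guarantee.
    """
    weights = np.array([decay ** t for t in range(rounds)])
    return total_eps * weights / weights.sum()

def compress_gradient(grad, k):
    """Dimension reduction via top-k sparsification (an assumption).

    Keeping only the k largest-magnitude components cuts both the
    communication cost and the number of dimensions that must be noised.
    """
    idx = np.argsort(np.abs(grad))[-k:]
    out = np.zeros_like(grad)
    out[idx] = grad[idx]
    return out

def perturb_gradient(grad, eps, clip=1.0, rng=None):
    """Clip to bound L1 sensitivity, then add Laplace noise (eps-LDP per round)."""
    rng = rng or np.random.default_rng()
    norm = np.linalg.norm(grad, ord=1)
    if norm > clip:
        grad = grad * (clip / norm)
    return grad + rng.laplace(0.0, clip / eps, size=grad.shape)
```

A client would then compress its local gradient, perturb it with that round's budget `eps[t]`, and upload the result; the decreasing schedule reflects the abstract's point that different rounds merit different budgets.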
Details
- Language :
- English
- ISSN :
- 13807501
- Volume :
- 83
- Issue :
- 9
- Database :
- Complementary Index
- Journal :
- Multimedia Tools & Applications
- Publication Type :
- Academic Journal
- Accession number :
- 175834291
- Full Text :
- https://doi.org/10.1007/s11042-023-16543-y