Toward the Application of Differential Privacy to Data Collaboration
- Source :
- IEEE Access, Vol. 12, pp. 63292-63301 (2024)
- Publication Year :
- 2024
- Publisher :
- IEEE, 2024.
Abstract
- Federated Learning, a model-sharing method, and Data Collaboration, a non-model-sharing method, are recognized as data analysis methods for distributed data. In Federated Learning, clients send only the parameters of a machine learning model to the central server. In Data Collaboration, clients send data that has been irreversibly transformed through dimensionality reduction to the central server. Both methods are designed with privacy in mind, but privacy is not guaranteed. Differential Privacy, a theoretical and quantitative privacy criterion, has been applied to Federated Learning to achieve rigorous privacy preservation. In this paper, we introduce a novel method using PCA (Principal Component Analysis), which finds a low-rank approximation of a matrix that preserves its variance, aiming to apply Differential Privacy to Data Collaboration. Experimental evaluation of the proposed method shows that differentially private Data Collaboration achieves performance comparable to differentially private Federated Learning.
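- As a rough illustration of the kind of transformation the abstract describes, and not the authors' actual method, the sketch below shows a client reducing its data with PCA and perturbing the projected representation with Gaussian noise. The function names, the unit L2 sensitivity, and the (epsilon, delta) values are assumptions chosen only for this example.
```python
# Minimal sketch: PCA-based dimensionality reduction followed by the
# classical Gaussian mechanism. This is an illustrative assumption, not
# the paper's algorithm.
import numpy as np

def pca_project(X, k):
    """Project rows of X onto the top-k principal components
    (a variance-preserving low-rank representation)."""
    Xc = X - X.mean(axis=0)                          # center the data
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                             # n x k reduced data

def gaussian_mechanism(Z, sensitivity, epsilon, delta, rng):
    """Add Gaussian noise calibrated to (epsilon, delta)-DP
    for the given L2 sensitivity."""
    sigma = sensitivity * np.sqrt(2 * np.log(1.25 / delta)) / epsilon
    return Z + rng.normal(scale=sigma, size=Z.shape)

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))       # hypothetical client data: 100 samples, 20 features
Z = pca_project(X, k=5)              # irreversible dimensionality reduction
Z_private = gaussian_mechanism(Z, sensitivity=1.0, epsilon=1.0, delta=1e-5, rng=rng)
```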
Details
- Language :
- English
- ISSN :
- 21693536
- Volume :
- 12
- Database :
- Directory of Open Access Journals
- Journal :
- IEEE Access
- Publication Type :
- Academic Journal
- Accession number :
- edsdoj.4905eb1a6dbf4ca9a4769cc0b03f2a2c
- Document Type :
- article
- Full Text :
- https://doi.org/10.1109/ACCESS.2024.3396146