FedPower: privacy-preserving distributed eigenspace estimation.
- Author
- Guo, Xiao; Li, Xiang; Chang, Xiangyu; Wang, Shusen; Zhang, Zhihua
- Subjects
- FEDERATED learning; DATA privacy; ARTIFICIAL intelligence; RANDOM noise theory; MACHINE learning
- Abstract
Eigenspace estimation is a fundamental tool in data analytics, which has found applications in PCA, dimension reduction, and clustering, among others. The modern machine learning community usually involves data that come from and belong to different organizations. The low communication power and possible data privacy breaches make the eigenspace estimation challenging. To address these issues, we propose a class of algorithms called FedPower within the federated learning (FL) framework. FedPower leverages the well-known power method by alternating multiple local power iterations and a global aggregation step, thus improving communication efficiency. In the aggregation, we propose to weight each local eigenvector matrix with Orthogonal Procrustes Transformation (OPT) for better alignment. We add Gaussian noise in each iteration to ensure strong privacy protection by adopting the notion of differential privacy (DP). We provide convergence bounds for FedPower composed of different interpretable terms corresponding to the effects of Gaussian noise, parallelization, and random sampling of local machines. Additionally, we conduct experiments to demonstrate the effectiveness of our proposed algorithms. [ABSTRACT FROM AUTHOR]
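The alternation the abstract describes (local power iterations, OPT-aligned aggregation, optional Gaussian noise) can be sketched in a few lines of NumPy. This is a hedged illustration of the general scheme, not the authors' reference implementation: the function name `fedpower`, the parameter choices, and the use of the first machine's basis as the Procrustes reference are all assumptions made for the example.

```python
import numpy as np

def orth(Y):
    # Orthonormalize the columns of Y via a thin QR factorization.
    Q, _ = np.linalg.qr(Y)
    return Q

def fedpower(local_covs, k, rounds=20, local_iters=4, noise_std=0.0, seed=0):
    """Sketch of a FedPower-style loop: each machine runs a few noisy local
    power iterations on its covariance matrix, then the server aligns the
    local bases with an Orthogonal Procrustes Transformation (OPT),
    averages them, and re-orthonormalizes. noise_std > 0 mimics the
    Gaussian noise added for differential privacy (calibration to a DP
    budget is omitted here)."""
    rng = np.random.default_rng(seed)
    d = local_covs[0].shape[0]
    Z = orth(rng.standard_normal((d, k)))  # shared random initial basis
    for _ in range(rounds):
        locals_ = []
        for A in local_covs:
            Y = Z
            for _ in range(local_iters):
                # Local power step, perturbed by Gaussian noise.
                Y = orth(A @ Y + noise_std * rng.standard_normal((d, k)))
            locals_.append(Y)
        # OPT alignment: rotate each local basis toward a common reference
        # (here, machine 0's basis) before averaging.
        ref = locals_[0]
        agg = np.zeros((d, k))
        for Y in locals_:
            U, _, Vt = np.linalg.svd(Y.T @ ref)
            agg += Y @ (U @ Vt)
        Z = orth(agg / len(local_covs))
    return Z

# Toy usage: split one dataset across 3 "machines" and recover the
# top-3 eigenspace of the global covariance.
rng = np.random.default_rng(1)
X = rng.standard_normal((300, 10)) @ np.diag([5, 4, 3] + [0.1] * 7)
parts = np.array_split(X, 3)
covs = [P.T @ P / P.shape[0] for P in parts]
U_hat = fedpower(covs, k=3)

C = sum(P.T @ P for P in parts) / X.shape[0]  # global covariance
_, V = np.linalg.eigh(C)
U = V[:, -3:]  # exact top-3 eigenspace
err = np.linalg.norm(U_hat @ U_hat.T - U @ U.T)  # projection distance
```

With `noise_std = 0` this reduces to a plain distributed power method; the paper's convergence bounds attribute the residual error to the noise, the parallel averaging, and the sampling of participating machines.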
- Published
- 2024