1. How to Privately Tune Hyperparameters in Federated Learning? Insights from a Benchmark Study
- Authors
Natalija Mitic, Apostolos Pyrgelis, and Sinem Sav
- Subjects
Computer Science - Cryptography and Security
- Abstract
In this paper, we address the problem of privacy-preserving hyperparameter (HP) tuning for cross-silo federated learning (FL). We first perform a comprehensive measurement study that benchmarks various HP strategies suitable for FL. Our benchmarks show that the optimal parameters of the FL server, e.g., the learning rate, can be accurately and efficiently tuned based on the HPs found by each client on its local data. We demonstrate that HP averaging is suitable for iid settings, while density-based clustering can uncover the optimal set of parameters in non-iid ones. Then, to prevent information leakage from the exchange of the clients' local HPs, we design and implement PrivTuna, a novel framework for privacy-preserving HP tuning using multiparty homomorphic encryption. We use PrivTuna to implement privacy-preserving federated averaging and density-based clustering, and we experimentally evaluate its performance, demonstrating its computation/communication efficiency and its precision in tuning hyperparameters.
- Published
2024
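
The abstract describes two server-side strategies for aggregating the clients' locally tuned hyperparameters: plain averaging for iid settings and density-based clustering for non-iid ones. Below is a minimal sketch of what such aggregation could look like in the clear, assuming a scikit-learn DBSCAN clusterer and made-up hyperparameter values; it is illustrative only and is not the paper's PrivTuna implementation, which performs these operations under multiparty homomorphic encryption.

```python
# Illustrative sketch (not PrivTuna): the server aggregates locally tuned
# hyperparameters either by averaging (iid setting) or by density-based
# clustering (non-iid setting). eps/min_samples values are assumptions.
import numpy as np
from collections import Counter
from sklearn.cluster import DBSCAN

def average_hps(client_hps: np.ndarray) -> np.ndarray:
    """Average each hyperparameter across clients (suitable when data is iid)."""
    return client_hps.mean(axis=0)

def cluster_hps(client_hps: np.ndarray, eps: float = 0.1, min_samples: int = 2) -> np.ndarray:
    """Return the centroid of the densest DBSCAN cluster (suitable for non-iid data)."""
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit(client_hps).labels_
    clustered = labels[labels != -1]          # drop noise points
    if clustered.size == 0:                   # no dense cluster found: fall back to averaging
        return average_hps(client_hps)
    majority = Counter(clustered).most_common(1)[0][0]
    return client_hps[labels == majority].mean(axis=0)

# Example: each row is one client's locally tuned (learning_rate, momentum).
hps = np.array([[0.010, 0.90],
                [0.012, 0.90],
                [0.011, 0.89],
                [0.100, 0.50]])   # an outlier client
print("averaged HPs :", average_hps(hps))
print("clustered HPs:", cluster_hps(hps))
```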