1. Fair Federated Learning with Multi-Objective Hyperparameter Optimization.
- Author
- Wang, Chunnan; Shi, Xiangyu; and Wang, Hongzhi
- Subjects
- FEDERATED learning; PARETO optimum; MACHINE learning; GLOBAL method of teaching; FAIRNESS
- Abstract
Federated learning (FL) is an attractive paradigm for privacy-aware distributed machine learning, which enables clients to collaboratively learn a global model without sharing their data. Recently, many strategies have been proposed to improve the generality of the global model and thus improve federated performance. However, existing strategies either ignore fairness among clients or sacrifice performance for fairness; they cannot ensure that the gap among clients is as small as possible without sacrificing federated performance. To address this issue, we propose ParetoFed, a new local information aggregation method dedicated to obtaining better federated performance with a smaller gap among clients. Specifically, we propose to use a multi-objective hyperparameter optimization (HPO) algorithm to obtain global models that are both fair and effective. We then send the Pareto-optimal global models to each client, allowing it to choose the most suitable one as the base for optimizing its local model. ParetoFed not only makes the global models fairer but also makes the selection of local models more personalized, which can further improve federated performance. Extensive experiments show that ParetoFed outperforms existing FL methods in terms of fairness and even achieves better federated performance, which demonstrates the significance of our method. [ABSTRACT FROM AUTHOR]
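The abstract describes two steps: keep only the Pareto-optimal candidate global models under two objectives (federated performance and fairness, i.e. the gap among clients), then let each client pick the candidate that suits its own data. The sketch below illustrates that general idea only; it is not the authors' implementation, and the objective values, scoring function, and variable names are hypothetical.

```python
# Minimal sketch of Pareto-front selection over candidate global models.
# Assumptions: each candidate is scored on (mean client accuracy, negative
# accuracy gap among clients), and higher is better on both objectives.
import numpy as np

def pareto_front(scores):
    """Return indices of non-dominated rows; higher is better on every column."""
    n = scores.shape[0]
    keep = []
    for i in range(n):
        dominated = any(
            np.all(scores[j] >= scores[i]) and np.any(scores[j] > scores[i])
            for j in range(n) if j != i
        )
        if not dominated:
            keep.append(i)
    return keep

# Hypothetical candidates from a multi-objective HPO run:
# column 0 = mean client accuracy, column 1 = negative accuracy gap (fairness).
candidates = np.array([
    [0.82, -0.10],
    [0.80, -0.04],
    [0.78, -0.02],
    [0.75, -0.06],   # dominated by the second row
])
front = pareto_front(candidates)  # Pareto-optimal candidate indices

# Each client evaluates only the Pareto-optimal candidates on its own
# validation data and keeps the best one as the base for local optimization.
local_val_acc = {0: 0.79, 1: 0.83, 2: 0.77}  # hypothetical per-candidate scores
best_for_client = max((i for i in front if i in local_val_acc),
                      key=lambda i: local_val_acc[i])
print("Pareto front:", front, "-> client picks candidate", best_for_client)
```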
- Published
- 2024