
FedEL: Federated ensemble learning for non-iid data

Authors :
Wu, Xing
Pei, Jie
Han, Xian-Hua
Chen, Yen-Wei
Yao, Junfeng
Liu, Yang
Qian, Quan
Guo, Yike
Publication Year :
2024

Abstract

Federated learning (FL) is a joint training paradigm that fully utilizes data information while protecting data privacy. A key challenge in FL is statistical heterogeneity, which arises from differing local data distributions among clients, leading to inconsistent local optimization goals and ultimately reducing the performance of the globally aggregated model. We propose Federated Ensemble Learning (FedEL), which makes full use of the heterogeneity of data distributions among clients to train a diverse group of weak learners that together form a global model, offering a novel solution to the non-independent and identically distributed (non-IID) data problem. Experiments demonstrate that the proposed FedEL can improve performance in non-IID data scenarios. Even under extreme statistical heterogeneity, the average accuracy of FedEL is 3.54% higher than the state-of-the-art FL method. Moreover, the proposed FedEL reduces model storage and inference costs compared with traditional ensemble learning. The proposed FedEL demonstrates good generalization ability in experiments across different datasets, including natural scene image datasets and medical image datasets. © 2023 Elsevier Ltd
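The abstract's core idea — letting each heterogeneous client contribute a weak learner that the server combines into a global ensemble — can be illustrated with a toy sketch. This is a hypothetical, minimal illustration of ensemble-style federated prediction, not FedEL's actual algorithm; the threshold classifier and majority-vote aggregation here are stand-in assumptions.

```python
# Hypothetical sketch (NOT the paper's method): each client fits a simple
# weak learner on its local, skewed data; the server aggregates their votes.

from collections import Counter

def train_weak_learner(local_data):
    """Fit a trivial 1-D threshold classifier: predict the majority label
    on each side of the local mean. local_data is a list of (x, label)."""
    mean_x = sum(x for x, _ in local_data) / len(local_data)
    left = [y for x, y in local_data if x < mean_x]
    right = [y for x, y in local_data if x >= mean_x]
    left_label = Counter(left).most_common(1)[0][0] if left else 0
    right_label = Counter(right).most_common(1)[0][0] if right else 1
    return lambda x: left_label if x < mean_x else right_label

def ensemble_predict(learners, x):
    """Server-side aggregation: majority vote over the clients' weak learners."""
    votes = Counter(f(x) for f in learners)
    return votes.most_common(1)[0][0]

# Three clients with non-IID local data (each sees a skewed slice of x-space).
client_datasets = [
    [(0.1, 0), (0.2, 0), (0.9, 1)],
    [(0.3, 0), (0.8, 1), (0.95, 1)],
    [(0.05, 0), (0.6, 1), (0.7, 1)],
]
learners = [train_weak_learner(d) for d in client_datasets]
print(ensemble_predict(learners, 0.1))  # low x -> class 0
print(ensemble_predict(learners, 0.9))  # high x -> class 1
```

The point of the sketch is that no raw data leaves a client, yet the diversity induced by the clients' heterogeneous local distributions is what gives the ensemble its coverage — the property the abstract credits for FedEL's robustness under non-IID conditions.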

Details

Database :
OAIster
Notes :
English
Publication Type :
Electronic Resource
Accession number :
edsoai.on1405235021
Document Type :
Electronic Resource