Federated learning via reweighting information bottleneck with domain generalization.
- Source :
- Information Sciences, Aug 2024, Vol. 677.
- Publication Year :
- 2024
Abstract
- Federated learning (FL) plays an important role in collaborative distributed modeling. However, most studies cannot address the poor generalization of models on out-of-distribution (OoD) data. Efforts have been made to address data heterogeneity among participants, but with limited success. Here, we propose an information bottleneck based FL method (FedIB), which aims to build a model with better OoD generalization. We extract the domain invariance of different source domains to mitigate domain heterogeneity under cross-silo scenarios. Next, given the scale imbalance, we balance the representation importance of different domains by reweighting, yielding a better invariance across multiple domains. In addition, the convergence of FedIB is analyzed. As opposed to previous methods that align distributions or eliminate redundancy, FedIB achieves better domain generalization explicitly by eliminating pseudo-invariant features. Finally, we conduct extensive experiments on various datasets revealing that FedIB achieves superior performance in OoD and scale-imbalance scenarios in distributed modeling.
  • We extract domain-invariant representations via the information bottleneck across different clients.
  • Pseudo-invariant features of different clients are eliminated in the form of mutual information.
  • We propose a reweighting technique to balance disturbances from data scales for a better approximation of domain invariance. [ABSTRACT FROM AUTHOR]
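The abstract names two mechanisms: an information bottleneck objective that extracts domain-invariant representations on each client, and a reweighting step that keeps large domains from dominating smaller ones. As a rough illustration only, here is a minimal PyTorch sketch of a variational IB client loss and a scale-balanced aggregation step; all names (IBEncoder, ib_loss, aggregate, beta) and the uniform-weight aggregation are assumptions made for this sketch, since the record does not give FedIB's actual formulation.

```python
# Hedged sketch of the two ingredients the abstract describes; not the
# paper's implementation. The exact FedIB objective is not in this record.
import torch
import torch.nn as nn
import torch.nn.functional as F

class IBEncoder(nn.Module):
    """Encoder q(z|x) as a diagonal Gaussian. In standard variational IB,
    the KL term to a N(0, I) prior upper-bounds I(Z; X)."""
    def __init__(self, in_dim=32, z_dim=8, num_classes=4):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU())
        self.mu = nn.Linear(64, z_dim)
        self.logvar = nn.Linear(64, z_dim)
        self.head = nn.Linear(z_dim, num_classes)

    def forward(self, x):
        h = self.backbone(x)
        mu, logvar = self.mu(h), self.logvar(h)
        # Reparameterization trick: z = mu + sigma * eps.
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()
        return self.head(z), mu, logvar

def ib_loss(logits, y, mu, logvar, beta=1e-3):
    """IB-style loss: cross-entropy lower-bounds I(Z; Y), while the
    closed-form KL(q(z|x) || N(0, I)) bounds I(Z; X)."""
    ce = F.cross_entropy(logits, y)
    kl = 0.5 * (mu.pow(2) + logvar.exp() - 1.0 - logvar).sum(dim=1).mean()
    return ce + beta * kl

def aggregate(client_states):
    """Scale-balanced averaging: uniform weights per domain instead of the
    size-proportional weights of FedAvg, so large domains cannot dominate
    the shared representation (one plausible reading of 'reweighting')."""
    n = len(client_states)
    return {k: sum(s[k] for s in client_states) / n
            for k in client_states[0].keys()}

# Toy usage: one local step on synthetic data.
model = IBEncoder()
x, y = torch.randn(16, 32), torch.randint(0, 4, (16,))
logits, mu, logvar = model(x)
ib_loss(logits, y, mu, logvar).backward()
```

The uniform averaging above is only one plausible reading of the abstract's reweighting; the paper may instead weight domains by an invariance criterion learned during training.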
- Subjects :
- *FEDERATED learning
- *GENERALIZATION
Details
- Language :
- English
- ISSN :
- 0020-0255
- Volume :
- 677
- Database :
- Academic Search Index
- Journal :
- Information Sciences
- Publication Type :
- Periodical
- Accession number :
- 177926246
- Full Text :
- https://doi.org/10.1016/j.ins.2024.120825