
FOOGD: Federated Collaboration for Both Out-of-distribution Generalization and Detection

Authors :
Liao, Xinting
Liu, Weiming
Zhou, Pengyang
Yu, Fengyuan
Xu, Jiahe
Wang, Jun
Wang, Wenjie
Chen, Chaochao
Zheng, Xiaolin
Publication Year :
2024

Abstract

Federated learning (FL) is a promising machine learning paradigm in which client models collaborate to capture global knowledge. However, deploying FL models in real-world scenarios remains unreliable due to the coexistence of in-distribution data and unexpected out-of-distribution (OOD) data, such as covariate-shift and semantic-shift data. Current FL research typically addresses either covariate-shift data through OOD generalization or semantic-shift data via OOD detection, overlooking the simultaneous occurrence of these OOD shifts. In this work, we propose FOOGD, a method that estimates the probability density of each client and obtains a reliable global distribution to guide the subsequent FL process. First, SM3D in FOOGD estimates a score model for arbitrary distributions without prior constraints and powerfully detects semantic-shift data. Then SAG in FOOGD provides invariant yet diverse knowledge for both local covariate-shift generalization and client performance generalization. In empirical validations, FOOGD offers three main advantages: (1) reliably estimating non-normalized decentralized distributions, (2) detecting semantic-shift data via score values, and (3) generalizing to covariate-shift data by regularizing the feature extractor. The project is available at https://github.com/XeniaLLL/FOOGD-main.git.

Comment: NeurIPS 2024
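The idea of detecting semantic-shift data via score values can be sketched with a toy example. This is a generic illustration, not the paper's SM3D implementation: the score function `score` and threshold below are hypothetical, using the fact that for a standard Gaussian the score is s(x) = d/dx log p(x) = -x, so samples far from the density's modes have large score magnitude and can be flagged as OOD.

```python
# Toy score-based OOD detection (illustrative only; not the paper's SM3D).
# Assume in-distribution data follows N(0, 1), whose score function is -x.

def score(x: float) -> float:
    # Score of the standard Gaussian: gradient of log-density w.r.t. x.
    return -x

def is_ood(x: float, threshold: float = 3.0) -> bool:
    # Large score magnitude => sample lies far from the density's modes.
    return abs(score(x)) > threshold

print(is_ood(0.5))   # typical in-distribution sample
print(is_ood(10.0))  # semantic-shift sample, far from the support
```

In practice the score model is learned from data rather than known in closed form, but the thresholding logic on score magnitude is the same.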

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2410.11397
Document Type :
Working Paper