Data-Free Distillation Improves Efficiency and Privacy in Federated Thorax Disease Analysis
- Publication Year :
- 2023
Abstract
- Thorax disease analysis in large-scale, multi-centre, and multi-scanner settings is often limited by strict privacy policies. Federated learning (FL) offers a potential solution, but traditional parameter-based FL can be limited by issues such as high communication costs, data leakage, and heterogeneity. Distillation-based FL can improve efficiency, but it relies on a proxy dataset, which is often impractical in clinical practice. To address these challenges, we introduce FedKDF, a data-free distillation-based FL approach. In FedKDF, the server employs a lightweight generator to aggregate knowledge from different clients without requiring access to their private data or a proxy dataset. FedKDF combines the predictors from clients into a single, unified predictor, which is further optimized using the knowledge learned by the lightweight generator. Our empirical experiments demonstrate that FedKDF offers a robust solution for efficient, privacy-preserving federated thorax disease analysis.
- Comment: Accepted by the IEEE EMBS International Conference on Data Science and Engineering in Healthcare, Medicine & Biology
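- The abstract only outlines the mechanism; the minimal PyTorch sketch below illustrates one way a server-side, data-free distillation round of this kind could be structured. The class names, dimensions (e.g. 14 thorax disease labels, latent and feature sizes), and the generator/distillation losses are illustrative assumptions, not the authors' released implementation.

```python
# Minimal sketch of a server-side, data-free distillation round, assuming a
# lightweight conditional generator and client-supplied predictor heads.
# All names, sizes, and losses here are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_CLASSES, LATENT_DIM, FEATURE_DIM = 14, 32, 128  # assumed sizes

class Generator(nn.Module):
    """Lightweight generator: maps (noise, label) to a synthetic feature vector."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(LATENT_DIM + NUM_CLASSES, 256), nn.ReLU(),
            nn.Linear(256, FEATURE_DIM),
        )

    def forward(self, z, y_onehot):
        return self.net(torch.cat([z, y_onehot], dim=1))

def server_round(client_predictors, unified_predictor, generator, steps=100):
    """One aggregation round: train the generator against the client ensemble,
    then distil the ensemble into the unified predictor on generated features."""
    g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
    p_opt = torch.optim.Adam(unified_predictor.parameters(), lr=1e-3)
    for _ in range(steps):
        y = torch.randint(0, NUM_CLASSES, (64,))
        y_onehot = F.one_hot(y, NUM_CLASSES).float()
        z = torch.randn(64, LATENT_DIM)

        # 1) Update the generator so its synthetic features are classified as
        #    the sampled labels by the client ensemble (knowledge aggregation
        #    without access to private data or a proxy dataset).
        feats = generator(z, y_onehot)
        ensemble_logits = torch.stack([p(feats) for p in client_predictors]).mean(0)
        g_loss = F.cross_entropy(ensemble_logits, y)
        g_opt.zero_grad(); g_loss.backward(); g_opt.step()

        # 2) Distil the ensemble's soft predictions on generated features into
        #    the single unified predictor.
        with torch.no_grad():
            feats = generator(z, y_onehot)
            teacher = torch.stack([p(feats) for p in client_predictors]).mean(0)
        student = unified_predictor(feats)
        d_loss = F.kl_div(F.log_softmax(student, 1), F.softmax(teacher, 1),
                          reduction="batchmean")
        p_opt.zero_grad(); d_loss.backward(); p_opt.step()
    return unified_predictor
```

- In this reading, only predictor heads and the distilled model cross the network, which is consistent with the abstract's claims about communication efficiency and avoiding raw-data or proxy-data exchange; the exact losses and architectures used by FedKDF may differ.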
Details
- Database :
- arXiv
- Publication Type :
- Report
- Accession number :
- edsarx.2310.18346
- Document Type :
- Working Paper