
Approximate Gradient Coding for Privacy-Flexible Federated Learning with Non-IID Data

Authors:
Makkonen, Okko
Niemelä, Sampo
Hollanti, Camilla
Hanna, Serge Kas
Publication Year:
2024

Abstract

This work focuses on the challenges of non-IID data and stragglers/dropouts in federated learning. We introduce and explore a privacy-flexible paradigm that models parts of the clients' local data as non-private, offering a more versatile and business-oriented perspective on privacy. Within this framework, we propose a data-driven strategy for mitigating the effects of label heterogeneity and client straggling on federated learning. Our solution combines both offline data sharing and approximate gradient coding techniques. Through numerical simulations using the MNIST dataset, we demonstrate that our approach enables a deliberate trade-off between privacy and utility, leading to improved model convergence and accuracy while using an adaptable portion of non-private data.
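To make the mechanism concrete, below is a minimal sketch (Python/NumPy) of how offline data sharing combined with an approximate gradient-coding assignment can tolerate stragglers. The cyclic replication layout, the replication factor, and all variable names are illustrative assumptions for exposition, not the exact construction proposed in the paper.

```python
import numpy as np

# Hypothetical sketch of approximate gradient coding with offline data
# sharing; the assignment scheme and parameters are assumptions.

rng = np.random.default_rng(0)

n_clients = 10      # number of clients
n_partitions = 10   # non-private data partitions shared offline
redundancy = 3      # each partition is replicated to this many clients
dim = 5             # model dimension

# Synthetic "partial gradients": one per data partition.
partial_grads = rng.normal(size=(n_partitions, dim))
full_grad = partial_grads.mean(axis=0)  # target of exact aggregation

# Offline data sharing: assign each partition to `redundancy` clients
# via a cyclic repetition layout (a common gradient-coding pattern).
assignment = [[(p + r) % n_clients for r in range(redundancy)]
              for p in range(n_partitions)]

# Each client sums the partial gradients of the partitions it holds.
client_msgs = np.zeros((n_clients, dim))
for p, clients in enumerate(assignment):
    for c in clients:
        client_msgs[c] += partial_grads[p]

# Some clients straggle and drop out; the server only sees the rest.
stragglers = set(rng.choice(n_clients, size=3, replace=False))
received = [c for c in range(n_clients) if c not in stragglers]

# Approximate recovery: each received message covers `redundancy`
# partitions, so rescaling by (len(received) * redundancy) estimates
# the mean gradient; the estimate is exact when no client straggles.
approx_grad = client_msgs[received].sum(axis=0) / (len(received) * redundancy)

print("relative error:",
      np.linalg.norm(approx_grad - full_grad) / np.linalg.norm(full_grad))
```

In this toy setup, replicating shared (non-private) partitions across clients lets the server form a reasonable gradient estimate from whichever subset of clients responds, which is the straggler-mitigation effect the abstract refers to; increasing the replication factor trades more data sharing for a smaller approximation error.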

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2404.03524
Document Type:
Working Paper