Distributed Gradient Descent Algorithm Robust to an Arbitrary Number of Byzantine Attackers.
- Source :
- IEEE Transactions on Signal Processing; 11/15/2019, Vol. 67 Issue 22, p5850-5864, 15p
- Publication Year :
- 2019
Abstract
- Due to the growth of modern dataset sizes and the desire to harness the computing power of multiple machines, there has been a recent surge of interest in the design of distributed machine learning algorithms. However, distributed algorithms are sensitive to Byzantine attackers, who can send falsified data to prevent the convergence of algorithms or lead the algorithms to converge to a value of the attackers’ choice. Some recent work proposed interesting algorithms that can handle the scenario in which up to half of the workers are compromised. In this paper, we propose a novel algorithm that can deal with an arbitrary number of Byzantine attackers. The main idea is to have the parameter server randomly select a small clean dataset and compute a noisy gradient using this small dataset. This noisy gradient is then used as a ground truth to filter out information sent by compromised workers. We show that the proposed algorithm converges to the neighborhood of the population minimizer regardless of the number of Byzantine attackers. We further provide numerical examples to show that the proposed algorithm can benefit from the presence of good workers and achieve better performance than existing algorithms. [ABSTRACT FROM AUTHOR]
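- The filtering idea described in the abstract can be sketched as follows. This is a minimal illustration under assumptions, not the authors' exact algorithm: the distance-based acceptance rule, the threshold value, and the fallback to the server's own gradient are all assumptions made for the sketch.

```python
import numpy as np

def filtered_aggregate(worker_grads, clean_grad, threshold):
    """Aggregate worker gradients robustly.

    clean_grad: the noisy gradient the parameter server computed on its
    small clean dataset, used as a ground truth. Workers whose reported
    gradient lies too far from it (Euclidean distance > threshold) are
    treated as Byzantine and discarded; the survivors are averaged.
    If every worker is filtered out, fall back to the clean gradient,
    so convergence does not depend on any worker being honest.
    """
    accepted = [g for g in worker_grads
                if np.linalg.norm(g - clean_grad) <= threshold]
    if not accepted:
        return clean_grad
    return np.mean(accepted, axis=0)

# Toy example: two honest workers near the true gradient, one Byzantine outlier.
clean = np.array([1.0, -2.0])          # server's noisy gradient on clean data
grads = [np.array([1.1, -1.9]),        # honest
         np.array([0.9, -2.1]),        # honest
         np.array([100.0, 100.0])]     # Byzantine report
agg = filtered_aggregate(grads, clean, threshold=1.0)
```

  Because the rejection test compares against the server's own gradient rather than against other workers, the scheme does not require a majority of honest workers, which is what allows an arbitrary number of attackers.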
- Subjects :
- RECOMMENDER systems
ALGORITHMS
MACHINE learning
FALSIFICATION of data
Details
- Language :
- English
- ISSN :
- 1053-587X
- Volume :
- 67
- Issue :
- 22
- Database :
- Complementary Index
- Journal :
- IEEE Transactions on Signal Processing
- Publication Type :
- Academic Journal
- Accession number :
- 139809471
- Full Text :
- https://doi.org/10.1109/TSP.2019.2946020