
Scheduling and Communication Schemes for Decentralized Federated Learning

Authors :
Abdelghany, Bahaa-Eldin Ali
Fernández-Vilas, Ana
Fernández-Veiga, Manuel
El-Bendary, Nashwa
Hassan, Ammar M.
Abdelmoez, Walid M.
Publication Year :
2023

Abstract

Federated learning (FL) is a distributed machine learning paradigm in which a large number of clients coordinate with a central server to learn a model without sharing their own training data. A single central server is often insufficient, owing to connectivity problems with the clients. In this paper, a decentralized federated learning (DFL) model based on the stochastic gradient descent (SGD) algorithm is introduced as a more scalable approach to improve learning performance in a network of agents with arbitrary topology. Three scheduling policies for the communications between the clients and the parallel servers are proposed, and the convergence, accuracy, and loss are evaluated in a fully decentralized implementation of SGD. The experimental results show that the proposed scheduling policies have an impact both on the speed of convergence and on the final global model.

Comment: 32nd International Conference on Computer Theory and Applications (ICCTA), Alexandria, Egypt, 2022
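
As an illustration of the decentralized SGD scheme described in the abstract, below is a minimal Python sketch in which agents alternate local SGD steps with pairwise model averaging along edges chosen by a scheduling policy. The Agent class, random_schedule(), and run_dfl() are hypothetical names introduced here; the abstract does not specify the paper's actual three policies, so a simple random-neighbor gossip policy stands in for them.

# A minimal, illustrative sketch of decentralized SGD with a pluggable
# scheduling policy. Names are assumptions for illustration; the abstract
# does not detail the paper's three scheduling policies.
import numpy as np

rng = np.random.default_rng(0)

class Agent:
    """One node in the network, holding local data and a local model."""
    def __init__(self, X, y):
        self.X, self.y = X, y
        self.w = np.zeros(X.shape[1])

    def local_sgd_step(self, lr=0.05):
        # One SGD step on a random local sample (least-squares loss).
        i = rng.integers(len(self.y))
        grad = (self.X[i] @ self.w - self.y[i]) * self.X[i]
        self.w = self.w - lr * grad

def random_schedule(adjacency):
    """Hypothetical policy: each agent gossips with one random neighbor."""
    return [(i, int(rng.choice(nbrs))) for i, nbrs in enumerate(adjacency) if nbrs]

def run_dfl(agents, adjacency, rounds=200, schedule=random_schedule):
    """Alternate local SGD steps with pairwise averaging on scheduled edges."""
    for _ in range(rounds):
        for a in agents:
            a.local_sgd_step()
        # Communication phase: average models along the scheduled edges.
        for i, j in schedule(adjacency):
            avg = (agents[i].w + agents[j].w) / 2.0
            agents[i].w, agents[j].w = avg.copy(), avg.copy()
    return np.mean([a.w for a in agents], axis=0)

# Toy usage: 4 agents on a ring topology, each fitting y = 2x from local data.
agents = []
for _ in range(4):
    X = rng.normal(size=(20, 1))
    y = 2.0 * X[:, 0] + 0.01 * rng.normal(size=20)
    agents.append(Agent(X, y))
ring = [[1, 3], [0, 2], [1, 3], [0, 2]]
print("consensus weight:", run_dfl(agents, ring))  # approaches [2.0]

The schedule argument is the hook where alternative edge-selection policies, such as the three proposed in the paper, would plug in without changing the rest of the loop.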

Details

Database :
OAIster
Publication Type :
Electronic Resource
Accession number :
edsoai.on1438503453
Document Type :
Electronic Resource
Full Text :
https://doi.org/10.1109/ICCTA58027.2022.10206255