
Distributed Model Training Based on Data Parallelism in Edge Computing-Enabled Elastic Optical Networks

Authors:
Jun Li
Boyuan Yan
Yajie Li
Jie Zhang
Yongli Zhao
Zebin Zeng
Source:
IEEE Communications Letters. 25:1241-1244
Publication Year:
2021
Publisher:
Institute of Electrical and Electronics Engineers (IEEE), 2021.

Abstract

The emergence of edge computing provides an effective way to execute distributed model training (DMT). The deployment of training data among edge nodes affects both training efficiency and network resource usage. This letter aims to provision DMT services efficiently by optimizing the partitioning and distribution of training data in edge computing-enabled optical networks. An integer linear programming (ILP) model and a data parallelism deployment algorithm (DPDA) are proposed to solve this problem. The performance of the proposed approaches is evaluated through simulation. Simulation results show that the proposed algorithm can deploy more DMT services than the benchmark.
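To illustrate the core idea of data parallelism that the abstract refers to, the sketch below splits a training set across edge nodes in proportion to each node's capacity. This is only a minimal illustration under assumed inputs (the function name, the capacity weights, and the proportional-split rule are all hypothetical), not the paper's ILP model or DPDA algorithm.

```python
# Illustrative sketch (NOT the paper's DPDA): partition `n_samples`
# training samples across edge nodes in proportion to each node's
# (assumed) capacity weight -- the basic step in data-parallel deployment.

def partition_samples(n_samples, capacities):
    """Return per-node sample counts proportional to node capacity."""
    total = sum(capacities)
    counts = [n_samples * c // total for c in capacities]
    # Integer division can leave a remainder; hand the leftover
    # samples to the highest-capacity nodes first.
    remainder = n_samples - sum(counts)
    order = sorted(range(len(capacities)), key=lambda i: -capacities[i])
    for i in order[:remainder]:
        counts[i] += 1
    return counts

# Example: 1000 samples over three nodes with capacity weights 4:2:2.
print(partition_samples(1000, [4, 2, 2]))  # -> [500, 250, 250]
```

In an actual edge network, the split would also account for link bandwidth and spectrum availability, which is what the letter's ILP formulation and DPDA heuristic jointly optimize.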

Details

ISSN:
2373-7891 and 1089-7798
Volume:
25
Journal:
IEEE Communications Letters
Full Text:
https://doi.org/10.1109/lcomm.2020.3041453