Distributed Model Training Based on Data Parallelism in Edge Computing-Enabled Elastic Optical Networks
- Source:
- IEEE Communications Letters 25:1241–1244
- Publication Year:
- 2021
- Publisher:
- Institute of Electrical and Electronics Engineers (IEEE)
Abstract
- The emergence of edge computing provides an effective platform for executing distributed model training (DMT). How training data is partitioned and placed among edge nodes affects both training efficiency and network resource usage. This letter targets the efficient provisioning of DMT services by optimizing the partitioning and distribution of training data in edge computing-enabled elastic optical networks. An integer linear programming (ILP) model and a data parallelism deployment algorithm (DPDA) are proposed to solve this problem. The performance of the proposed approaches is evaluated through simulation, and the results show that the proposed algorithm can deploy more DMT services than the benchmark algorithm.
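The core idea the abstract describes — splitting a training dataset across edge nodes in data-parallel fashion — can be sketched as follows. This is a minimal illustrative sketch only, assuming a capacity-proportional split; it is not the authors' ILP model or DPDA heuristic, and the function and parameter names are hypothetical.

```python
def partition_training_data(total_samples, capacities):
    """Split `total_samples` across edge nodes in proportion to their
    (hypothetical) compute capacities, for data-parallel training.

    Returns a per-node sample count that sums to `total_samples`.
    """
    total_cap = sum(capacities)
    # Floor of each node's proportional share.
    shares = [total_samples * c // total_cap for c in capacities]
    # Hand leftover samples to the highest-capacity nodes first.
    remainder = total_samples - sum(shares)
    order = sorted(range(len(capacities)), key=lambda i: -capacities[i])
    for i in order[:remainder]:
        shares[i] += 1
    return shares


# Example: 100 samples over three edge nodes with capacities 4:3:3.
print(partition_training_data(100, [4, 3, 3]))  # → [40, 30, 30]
```

The actual problem in the letter is harder than this sketch: the ILP jointly accounts for optical-network spectrum and computing resources when choosing the partition, which a purely proportional split ignores.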
- Subjects:
- Data parallelism
Computer science
Distributed computing
020206 networking & telecommunications
Provisioning
02 engineering and technology
Partition (database)
Computer Science Applications
Data modeling
Modeling and Simulation
0202 electrical engineering, electronic engineering, information engineering
Benchmark (computing)
Enhanced Data Rates for GSM Evolution
Electrical and Electronic Engineering
Integer programming
Edge computing
Details
- ISSN:
- 2373-7891 and 1089-7798
- Volume:
- 25
- Database:
- OpenAIRE
- Journal:
- IEEE Communications Letters
- Accession number:
- edsair.doi...........8c8497ba962656e2ed90f6032db43c7d
- Full Text:
- https://doi.org/10.1109/lcomm.2020.3041453