Efficient Online Scheduling for Coflow-Aware Machine Learning Clusters
- Author
Renhai Xu, Sheng Chen, Heng Qi, Song Zhang, Wenxin Li, and Keqiu Li
- Subjects
Computer Networks and Communications, Computer Science, Workload, Cloud Computing, Machine Learning, Computer Science Applications, Scheduling (computing), Hardware and Architecture, Artificial Intelligence, Completion Time, Software, Information Systems
- Abstract
Distributed machine learning (DML) is an increasingly important workload. In a DML job, each communication stage can comprise a coflow, and dependencies exist among these coflows, so efficient coflow scheduling is critical for DML jobs. However, most existing solutions focus on scheduling single-stage coflows with no dependencies. The few studies that do schedule the dependent coflows of multi-stage jobs suffer from either practical or theoretical issues. In this paper, we study how to schedule the dependent coflows of multiple DML jobs to minimize the total job completion time (JCT) in a shared cluster. To solve this problem without any prior knowledge of job information, we present an online coflow-aware optimization framework called Parrot. The core idea of Parrot is to infer, at each scheduling point, the job with the shortest remaining processing time (SRPT) and to dynamically control the inferred job's bandwidth according to how confident Parrot is in that inference, while ensuring that no other job is starved. We prove that the Parrot algorithm has an approximation ratio of $O(M)$, where $M$ is the number of jobs. Results from large-scale trace-driven simulations further demonstrate that Parrot can reduce the total JCT by up to 58.4% compared to the state-of-the-art solution Aalo.
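To make the core idea concrete, here is a minimal sketch (not the actual Parrot algorithm, whose details are in the paper) of a confidence-weighted SRPT bandwidth allocator: the job with the smallest estimated remaining work receives a confidence-weighted share of the link, while every job retains a minimum share to avoid starvation. The function name, the fixed `min_share` reservation, and the confidence heuristic are all illustrative assumptions.

```python
def allocate_bandwidth(remaining, confidence, total_bw=1.0, min_share=0.05):
    """Split total_bw across jobs, favoring the inferred-SRPT job.

    remaining:  {job_id: estimated remaining bytes} (online estimates)
    confidence: weight in [0, 1] expressing trust in the SRPT inference
    """
    if not remaining:
        return {}
    # Infer the SRPT job from the current remaining-work estimates.
    srpt_job = min(remaining, key=remaining.get)
    n = len(remaining)
    # Reserve a minimum share for every job so none starves, then give
    # the inferred SRPT job the confidence-weighted portion of the
    # leftover; the rest is spread evenly over the other jobs.
    leftover = max(total_bw - min_share * n, 0.0)
    alloc = {job: min_share for job in remaining}
    alloc[srpt_job] += confidence * leftover
    if n > 1:
        spread = (1.0 - confidence) * leftover / (n - 1)
        for job in remaining:
            if job != srpt_job:
                alloc[job] += spread
    else:
        alloc[srpt_job] += (1.0 - confidence) * leftover
    return alloc
```

With high confidence, the allocation approaches pure SRPT; with low confidence, it degrades toward fair sharing, which mirrors the trade-off the abstract describes.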
- Published
- 2022