1. When Computing Power Network Meets Distributed Machine Learning: An Efficient Federated Split Learning Framework
- Authors
Yuan, Xinjing; Pu, Lingjun; Jiao, Lei; Wang, Xiaofei; Yang, Meijuan; and Xu, Jingdong
- Subjects
Computer Science - Networking and Internet Architecture; Computer Science - Machine Learning
- Abstract
In this paper, we advocate CPN-FedSL, a novel and flexible Federated Split Learning (FedSL) framework over Computing Power Network (CPN). We build a dedicated model to capture the basic settings and learning characteristics (e.g., training flow, latency and convergence). Based on this model, we introduce Resource Usage Effectiveness (RUE), a novel performance metric integrating training utility with system cost, and formulate a multivariate scheduling problem that maximizes RUE by comprehensively taking client admission, model partition, server selection, routing and bandwidth allocation into account (i.e., mixed-integer fractional programming). We design Refinery, an efficient approach that first linearizes the fractional objective and non-convex constraints, and then solves the transformed problem via a greedy-based rounding algorithm in multiple iterations. Extensive evaluations corroborate that CPN-FedSL is superior to standard and state-of-the-art learning frameworks (e.g., FedAvg and SplitFed), and that Refinery is lightweight and significantly outperforms its variants and de facto heuristic methods under a variety of settings. Comment: 10 pages, 8 figures, accepted by IEEE/ACM IWQoS 2023
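The abstract only outlines Refinery at a high level (linearize the fractional objective, then greedily round). The general pattern it describes can be illustrated with a Dinkelbach-style transform on a toy problem: maximizing a utility-to-cost ratio under a budget, re-linearizing and greedily rounding in each iteration. This is a minimal sketch under assumed toy inputs, not the paper's actual formulation or algorithm; all names here are hypothetical.

```python
def fractional_greedy_sketch(utility, cost, budget, iters=20):
    """Toy illustration of linearize-then-round for max sum(u_i x_i) / sum(c_i x_i)
    s.t. sum(c_i x_i) <= budget, x_i in {0, 1}.

    Dinkelbach-style: for a current ratio guess lam, the linearized subproblem is
    maximize sum((u_i - lam * c_i) * x_i); we round it greedily and update lam.
    """
    n = len(utility)
    lam = 0.0
    best = None
    for _ in range(iters):
        # Greedy rounding of the linearized subproblem: take items with the
        # largest positive linearized gain first, respecting the budget.
        order = sorted(range(n), key=lambda i: utility[i] - lam * cost[i], reverse=True)
        x, spent = [0] * n, 0.0
        for i in order:
            if utility[i] - lam * cost[i] <= 0:
                break  # sorted descending, so all remaining gains are non-positive
            if spent + cost[i] <= budget:
                x[i] = 1
                spent += cost[i]
        u = sum(utility[i] for i in range(n) if x[i])
        c = sum(cost[i] for i in range(n) if x[i])
        if c == 0:
            break  # nothing selected; lam can no longer improve
        new_lam = u / c
        best = (x, new_lam)
        if abs(new_lam - lam) < 1e-9:
            break  # ratio converged
        lam = new_lam
    return best
```

For example, with `utility=[10, 4, 6]`, `cost=[5, 1, 4]`, `budget=6`, the iterations converge to selecting only the second item, whose ratio 4/1 is the best achievable. The actual Refinery operates on a far richer mixed-integer program (admission, partition, routing, bandwidth), but the iterate-linearize-round loop above is the shape the abstract describes.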
- Published
2023