
Snake Learning: A Communication- and Computation-Efficient Distributed Learning Framework for 6G

Authors:
Yu, Xiaoxue
Yi, Xingfu
Li, Rongpeng
Wang, Fei
Peng, Chenghui
Zhao, Zhifeng
Zhang, Honggang
Publication Year:
2024

Abstract

In the evolution towards 6G, integrating Artificial Intelligence (AI) with advanced network infrastructure emerges as a pivotal strategy for enhancing network intelligence and resource utilization. Existing distributed learning frameworks like Federated Learning and Split Learning often struggle with significant challenges in dynamic network environments, including high synchronization demands, costly communication overheads, severe computing resource consumption, and data heterogeneity across network nodes. These obstacles hinder the application of the ubiquitous computing capabilities of 6G networks, especially in light of the trend of escalating model parameters and training data volumes. To address these challenges effectively, this paper introduces "Snake Learning", a cost-effective distributed learning framework. Specifically, Snake Learning respects the heterogeneity of inter-node computing capability and local data distribution in 6G networks, and sequentially trains designated parts of the model layers on individual nodes. This layer-by-layer serpentine update mechanism significantly reduces the storage, memory, and communication requirements during the model training phase, and demonstrates superior adaptability and efficiency for both Computer Vision (CV) training and Large Language Model (LLM) fine-tuning tasks across homogeneous and heterogeneous data distributions.

Comment: 7 pages, 6 figures
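The serpentine update idea described above can be illustrated with a minimal scheduling sketch. This is a hypothetical illustration only, not the paper's actual algorithm or API: it assumes the model's layers are partitioned into blocks and that the list of nodes is swept forward, then backward, like a snake, with each visited node training one layer block at a time.

```python
def serpentine_schedule(num_layer_blocks, nodes, rounds):
    """Hypothetical sketch: assign one layer block per node visit,
    sweeping the node list forward then backward each round
    ("snake" order), so every node only ever trains a designated
    slice of the model rather than the full parameter set."""
    schedule = []
    block = 0
    for r in range(rounds):
        # Reverse the traversal direction on alternating rounds.
        order = nodes if r % 2 == 0 else list(reversed(nodes))
        for node in order:
            schedule.append((node, block % num_layer_blocks))
            block += 1
    return schedule

# Example: 4 layer blocks, 3 nodes, 2 rounds of serpentine sweeps.
print(serpentine_schedule(4, ["A", "B", "C"], 2))
```

Because each node touches only one layer block per visit, the per-node memory and communication footprint stays small, which is the intuition the abstract's cost-reduction claim rests on.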

Details

Database:
arXiv
Publication Type:
Report
Accession Number:
edsarx.2405.03372
Document Type:
Working Paper