
Exploring Selective Layer Fine-Tuning in Federated Learning

Authors:
Sun, Yuchang
Xie, Yuexiang
Ding, Bolin
Li, Yaliang
Zhang, Jun
Publication Year:
2024

Abstract

Federated learning (FL) has emerged as a promising paradigm for fine-tuning foundation models on distributed data in a privacy-preserving manner. Under limited computational resources, clients often find it more practical to fine-tune a selected subset of layers, rather than the entire model, based on their task-specific data. In this study, we provide a thorough theoretical exploration of selective layer fine-tuning in FL, emphasizing a flexible approach that allows clients to adjust their selected layers according to their local data and resources. We theoretically demonstrate that the layer selection strategy has a significant impact on model convergence in two critical aspects: the importance of the selected layers and the heterogeneity of choices across clients. Drawing on these insights, we further propose a strategic layer selection method that utilizes local gradients and regulates layer selections across clients. Extensive experiments on both image and text datasets demonstrate the effectiveness of the proposed strategy compared with several baselines, highlighting its advantage in identifying critical layers that adapt to client heterogeneity and training dynamics in FL.
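To make the idea concrete, here is a minimal toy sketch of gradient-based selective layer fine-tuning in FL. It is an illustrative simplification, not the paper's actual algorithm: each client ranks layers by the L2 norm of its local gradient, fine-tunes only its top-k layers, and the server averages each layer's update over the clients that selected it. All names (`grad_norms`, `select_layers`, `aggregate`) and the toy model are hypothetical.

```python
# Hypothetical sketch of selective layer fine-tuning in federated learning:
# clients pick layers by local gradient norm; the server averages per-layer
# updates over the clients that selected each layer.

def grad_norms(grads):
    """Per-layer L2 norm of a {layer_name: [g1, g2, ...]} gradient dict."""
    return {name: sum(g * g for g in gs) ** 0.5 for name, gs in grads.items()}

def select_layers(grads, k):
    """Pick the k layers with the largest local gradient norm."""
    norms = grad_norms(grads)
    return sorted(norms, key=norms.get, reverse=True)[:k]

def aggregate(global_model, client_updates):
    """Average each layer's update over the clients that selected it."""
    new_model = {name: list(params) for name, params in global_model.items()}
    for name in global_model:
        deltas = [u[name] for u in client_updates if name in u]
        if not deltas:
            continue  # no client touched this layer this round
        for i in range(len(new_model[name])):
            new_model[name][i] += sum(d[i] for d in deltas) / len(deltas)
    return new_model

# Toy round: two clients, each fine-tuning its single most important layer.
model = {"embed": [0.0, 0.0], "attn": [0.0, 0.0], "head": [0.0, 0.0]}
client_grads = [
    {"embed": [0.1, 0.1], "attn": [2.0, 2.0], "head": [0.2, 0.1]},
    {"embed": [0.1, 0.0], "attn": [0.1, 0.2], "head": [1.0, 1.0]},
]
lr = 0.5
updates = []
for grads in client_grads:
    chosen = select_layers(grads, k=1)  # client 1 picks "attn", client 2 "head"
    updates.append({name: [-lr * g for g in grads[name]] for name in chosen})

model = aggregate(model, updates)
print(model)  # "embed" is untouched; "attn" and "head" each updated by one client
```

The heterogeneity the abstract highlights appears even in this toy round: the two clients select different layers, so each layer's aggregate is computed over a different subset of clients.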

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2408.15600
Document Type:
Working Paper