
Can Public Large Language Models Help Private Cross-device Federated Learning?

Authors:
Wang, Boxin
Zhang, Yibo Jacky
Cao, Yuan
Li, Bo
McMahan, H. Brendan
Oh, Sewoong
Xu, Zheng
Zaheer, Manzil
Publication Year:
2023

Abstract

We study (differentially) private federated learning (FL) of language models. The language models in cross-device FL are relatively small and can be trained with meaningful formal user-level differential privacy (DP) guarantees when massive parallelism in training is enabled by the participation of a moderate number of users. Recently, public data has been used to improve privacy-utility trade-offs for both large and small language models. In this work, we provide a systematic study of using large-scale public data and LLMs to help differentially private training of on-device FL models, and we further improve the privacy-utility trade-off through distillation techniques. Moreover, we propose a novel distribution matching algorithm with theoretical grounding to sample public data close to the private data distribution, which significantly improves the sample efficiency of (pre-)training on public data. The proposed method is efficient and effective for training private models by taking advantage of public data, especially for customized on-device architectures that do not have ready-to-use pre-trained models.

Comment: Published at Findings of NAACL 2024
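The abstract does not spell out the distribution matching algorithm, so the sketch below is only a rough, hypothetical illustration of the general idea: select public examples whose representations lie close to the private data distribution. It scores public embeddings against a noise-protected (Gaussian-mechanism) mean of private embeddings and keeps the closest ones. The function names, the DP mean-embedding summary, and the cosine-similarity criterion are assumptions made here for illustration, not the authors' actual method.

import numpy as np

def dp_mean_embedding(private_embs, clip_norm=1.0, noise_mult=1.0, seed=0):
    # Hypothetical DP summary of private data: clip each embedding to
    # `clip_norm`, average, and add Gaussian noise (Gaussian mechanism).
    rng = np.random.default_rng(seed)
    norms = np.linalg.norm(private_embs, axis=1, keepdims=True)
    clipped = private_embs * np.minimum(1.0, clip_norm / np.maximum(norms, 1e-12))
    mean = clipped.mean(axis=0)
    noise = rng.normal(0.0, noise_mult * clip_norm / len(private_embs), size=mean.shape)
    return mean + noise

def select_public_subset(public_embs, private_mean, k):
    # Rank public examples by cosine similarity to the noisy private mean
    # and keep the k closest ones for (pre-)training.
    pub = public_embs / np.maximum(np.linalg.norm(public_embs, axis=1, keepdims=True), 1e-12)
    ref = private_mean / max(np.linalg.norm(private_mean), 1e-12)
    scores = pub @ ref
    return np.argsort(-scores)[:k]

# Toy usage with random vectors standing in for text-encoder embeddings.
private_embs = np.random.default_rng(1).normal(size=(500, 64))
public_embs = np.random.default_rng(2).normal(size=(10000, 64))
idx = select_public_subset(public_embs, dp_mean_embedding(private_embs), k=1000)
print(idx.shape)  # (1000,) indices of selected public examples

In a real pipeline the embeddings would come from a text encoder, and the selected public subset would feed (pre-)training or distillation of the on-device model before private federated fine-tuning.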

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2305.12132
Document Type:
Working Paper