
COMMUTE: communication-efficient transfer learning for multi-site risk prediction

Authors :
Tian Gu
Phil H. Lee
Rui Duan
Publication Year :
2022
Publisher :
Cold Spring Harbor Laboratory, 2022.

Abstract

Objectives

We propose a communication-efficient transfer learning approach (COMMUTE) that efficiently and effectively incorporates multi-site healthcare data for training risk prediction models in a target population of interest, accounting for challenges including population heterogeneity and data-sharing constraints across sites.

Methods

We first train population-specific source models locally within each institution. Using data from a given target population, COMMUTE learns a calibration term for each source model, which adjusts for potential data heterogeneity through flexible distance-based regularizations. In a centralized setting where multi-site data can be directly pooled, all data are combined to train the target model after calibration. When individual-level data are not shareable at some sites, COMMUTE requests only the locally trained models from those sites, with which it generates heterogeneity-adjusted synthetic data for training the target model. We evaluate COMMUTE via extensive simulation studies and an application to multi-site data from the electronic Medical Records and Genomics (eMERGE) Network to predict extreme obesity.

Results

Simulation studies show that COMMUTE outperforms methods that do not adjust for population heterogeneity, as well as methods trained in a single population, across a broad spectrum of settings. Using eMERGE data, COMMUTE achieves an area under the receiver operating characteristic curve (AUC) of around 0.80, outperforming benchmark methods with AUCs ranging from 0.51 to 0.70.

Conclusion

COMMUTE improves risk prediction in the target population and safeguards against negative transfer when some source populations are highly different from the target. In a federated setting, it is highly communication efficient, as it requires each site to share model parameter estimates only once; no iterative communication or higher-order terms are needed.
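The calibration idea described in the Methods can be illustrated with a minimal sketch. This is not the authors' implementation: COMMUTE uses flexible distance-based regularizations, whereas the sketch below uses a simple squared-L2 distance penalty that shrinks a target logistic model toward a source model's coefficients. The function name `calibrate_to_target` and all parameters are illustrative assumptions.

```python
# Hypothetical sketch of distance-based calibration toward a source model.
# Assumption: a squared-L2 penalty stands in for COMMUTE's flexible
# distance-based regularizations; this is not the paper's actual method.
import numpy as np

def calibrate_to_target(X, y, w_source, lam=0.1, lr=0.1, n_iter=500):
    """Fit a logistic model on target data, shrinking toward w_source.

    Minimizes: logistic loss + (lam / 2) * ||w - w_source||^2.
    Large lam keeps the target model close to the source model;
    lam = 0 ignores the source model entirely.
    """
    w = w_source.copy()
    n = X.shape[0]
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ w))              # predicted probabilities
        grad = X.T @ (p - y) / n + lam * (w - w_source)  # loss + penalty gradient
        w -= lr * grad
    return w

# Toy usage: a source model that is heterogeneous relative to the target.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
w_true = np.array([1.0, -1.0, 0.5])
y = (rng.random(200) < 1.0 / (1.0 + np.exp(-X @ w_true))).astype(float)
w_source = w_true + np.array([0.5, 0.0, -0.3])       # shifted source coefficients
w_cal = calibrate_to_target(X, y, w_source)
```

The penalty strength `lam` plays the role of the calibration term's regularization: it interpolates between trusting the source model fully and refitting from the target data alone, which is how such schemes guard against negative transfer from highly different source populations.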

Details

Database :
OpenAIRE
Accession number :
edsair.doi.dedup.....a14e97d6ff2bb0ad356977a66b05f3dc