Federated learning improves site performance in multicenter deep learning without data sharing

Authors:
Leonard S. Marks
William Speier
Baris Turkbey
Bradford J. Wood
Steven S. Raman
Thomas Sanford
Rushikesh Kulkarni
Alex G. Raman
Jesse Tetreault
Alan Priester
Mona Flores
Holger R. Roth
Daguang Xu
Ziyue Xu
Peter L. Choyke
Corey W. Arnold
Karthik V. Sarma
Stephanie Harmon
Dieter R. Enzmann
Source:
Journal of the American Medical Informatics Association (JAMIA), vol. 28, iss. 6
Publication Year:
2021
Publisher:
Oxford University Press (OUP), 2021.

Abstract

Objective: To demonstrate that federated learning (FL) enables multi-institutional training of deep learning models without centralizing or sharing the underlying physical data.

Materials and Methods: Deep learning models were trained at each participating institution using local clinical data, and an additional model was trained using FL across all of the institutions.

Results: The FL model exhibited superior performance and generalizability compared with the models trained at single institutions: when evaluated on held-out test sets from each institution and on an outside challenge dataset, its overall performance was significantly better than that of any institutional model alone.

Discussion: The power of FL was successfully demonstrated across 3 academic institutions while avoiding the privacy risks associated with transferring and pooling patient data.

Conclusion: Federated learning is an effective methodology that merits further study as a means of accelerating model development across institutions and improving generalizability in clinical use.
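As a rough illustration of the federated training the abstract describes, the Python sketch below implements federated averaging (FedAvg), a common FL aggregation scheme: each simulated site trains a model on its own data, and a server averages the site models weighted by sample count. This is a generic toy on synthetic data, not the authors' implementation (which trained deep networks on clinical data across 3 institutions); every dataset, function, and parameter here is hypothetical.

import numpy as np

# Minimal federated averaging (FedAvg) sketch on synthetic data.
# Each "institution" trains a logistic-regression model locally;
# a server averages the site weights, weighted by local sample count.

rng = np.random.default_rng(0)

def make_site(n, shift):
    """Synthetic binary-classification data with a site-specific covariate shift."""
    X = rng.normal(shift, 1.0, size=(n, 5))
    w_true = np.array([1.0, -2.0, 0.5, 0.0, 1.5])
    y = (X @ w_true + rng.normal(0, 0.5, n) > 0).astype(float)
    return X, y

def local_train(w, X, y, epochs=5, lr=0.1):
    """A few epochs of full-batch gradient descent on the logistic loss."""
    w = w.copy()
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))   # predicted probabilities
        w -= lr * X.T @ (p - y) / len(y)     # gradient step
    return w

# Three simulated institutions with different sizes and distributions.
sites = [make_site(n, s) for n, s in [(200, 0.0), (150, 0.3), (100, -0.2)]]
w_global = np.zeros(5)

for rnd in range(20):                        # communication rounds
    local_ws, counts = [], []
    for X, y in sites:                       # each site trains on local data only
        local_ws.append(local_train(w_global, X, y))
        counts.append(len(y))
    # Server step: sample-size-weighted average of the site models.
    w_global = np.average(local_ws, axis=0, weights=counts)

for i, (X, y) in enumerate(sites):
    acc = (((X @ w_global) > 0).astype(float) == y).mean()
    print(f"site {i}: accuracy of federated model = {acc:.2f}")

In the setting the abstract describes, each weight vector would instead be the parameters of a deep network, and only these parameters (never the underlying patient data) would travel between the sites and the aggregation server.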

Details

ISSN:
1527-974X
Volume:
28
Database:
OpenAIRE
Journal:
Journal of the American Medical Informatics Association
Accession number:
edsair.doi.dedup.....cfcc43846600977c2b828330a2b4c1eb
Full Text:
https://doi.org/10.1093/jamia/ocaa341