
Predicting sex from retinal fundus photographs using automated deep learning

Authors: Edward Korot, Nikolas Pontikos, Xiaoxuan Liu, Siegfried K. Wagner, Livia Faes, Josef Huemer, Konstantinos Balaskas, Alastair K. Denniston, Anthony Khawaja, Pearse A. Keane
Source: Scientific Reports, Vol 11, Iss 1, Pp 1-8 (2021)
Publication Year: 2021
Publisher: Nature Portfolio, 2021.

Abstract

Deep learning may transform health care, but model development has largely been dependent on the availability of advanced technical expertise. Herein we present the development of a deep learning model by clinicians without coding, which predicts reported sex from retinal fundus photographs. A model was trained on 84,743 retinal fundus photos from the UK Biobank dataset. External validation was performed on 252 fundus photos from a tertiary ophthalmic referral center. For internal validation, the area under the receiver operating characteristic curve (AUROC) of the code-free deep learning (CFDL) model was 0.93. Sensitivity, specificity, positive predictive value (PPV) and accuracy (ACC) were 88.8%, 83.6%, 87.3% and 86.5% respectively; for external validation they were 83.9%, 72.2%, 78.2% and 78.6%. Clinicians are currently unaware of distinct retinal feature variations between males and females, highlighting the importance of model explainability for this task. The model performed significantly worse when foveal pathology was present in the external validation dataset (ACC 69.4%, versus 85.4% in healthy eyes; OR 0.36, 95% CI 0.19-0.70, p = 0.0022), suggesting the fovea is a salient region for model performance. Automated machine learning (AutoML) may enable clinician-driven automated discovery of novel insights and disease biomarkers.
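For readers who want to reproduce this style of evaluation, below is a minimal sketch of how the reported metrics (AUROC, sensitivity, specificity, PPV, ACC) and the subgroup odds ratio could be computed from binary predictions. The 0.5 decision threshold, the function names, and the Woolf-type confidence interval are illustrative assumptions; this is not the authors' code-free AutoML pipeline.

# Minimal sketch (assumed setup, not the authors' CFDL pipeline):
# binary sex labels (1 = male, 0 = female) and model scores in [0, 1].
import numpy as np
from scipy.stats import fisher_exact
from sklearn.metrics import confusion_matrix, roc_auc_score

def evaluate(y_true, y_score, threshold=0.5):
    # Threshold scores into class predictions; 0.5 is an assumption.
    y_pred = (np.asarray(y_score) >= threshold).astype(int)
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
    return {
        "AUROC": roc_auc_score(y_true, y_score),
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "PPV": tp / (tp + fp),
        "ACC": (tp + tn) / (tp + tn + fp + fn),
    }

def subgroup_odds_ratio(correct_a, wrong_a, correct_b, wrong_b):
    # 2x2 table of correct vs. incorrect predictions in two subgroups,
    # e.g. eyes with foveal pathology (a) vs. healthy eyes (b).
    or_, p = fisher_exact([[correct_a, wrong_a], [correct_b, wrong_b]])
    # Woolf (log) method for an approximate 95% confidence interval.
    se = np.sqrt(1 / correct_a + 1 / wrong_a + 1 / correct_b + 1 / wrong_b)
    return or_, (or_ * np.exp(-1.96 * se), or_ * np.exp(1.96 * se)), p

In this framing, an odds ratio below 1 means the odds of a correct prediction are lower in the pathology subgroup than in healthy eyes, which is the direction of the reported OR of 0.36.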

Subjects

Medicine, Science

Details

Language: English
ISSN: 2045-2322
Volume: 11
Issue: 1
Database: Directory of Open Access Journals
Journal: Scientific Reports
Publication Type: Academic Journal
Accession number: edsdoj.79d88f9f4344e5db04de21644fcea1e
Document Type: article
Full Text: https://doi.org/10.1038/s41598-021-89743-x