
Classification of diabetic retinopathy using unlabeled data and knowledge distillation.

Authors:
Abbasi S
Hajabdollahi M
Khadivi P
Karimi N
Roshandel R
Shirani S
Samavi S
Source:
Artificial intelligence in medicine [Artif Intell Med] 2021 Nov; Vol. 121, pp. 102176. Date of Electronic Publication: 2021 Sep 17.
Publication Year:
2021

Abstract

Over the last decade, advances in Machine Learning and Artificial Intelligence have highlighted their potential as diagnostic tools in the healthcare domain. Despite the widespread availability of medical images, their usefulness is severely hampered by a lack of access to labeled data. For example, while Convolutional Neural Networks (CNNs) have emerged as an essential analytical tool in image processing, their impact is curtailed by training limitations stemming from insufficient labeled data. Transfer learning enables a model developed for one task to be reused for a second task, but it suffers from the constraint that the two models need to be architecturally similar. Knowledge distillation, which transfers knowledge from a pre-trained model to another model, addresses some of the shortcomings of transfer learning by generalizing a complex model to a lighter one. However, knowledge distillation alone may not transfer some parts of the knowledge sufficiently. In this paper, a novel knowledge distillation approach using transfer learning is proposed. The proposed approach transfers the complete knowledge of a model to a new, smaller one. Unlabeled data are used in an unsupervised manner to transfer the maximum amount of knowledge to the new, smaller model. The proposed method can be beneficial in medical image analysis, where labeled data are typically scarce. The approach is evaluated on the classification of images for diagnosing Diabetic Retinopathy using two publicly available datasets, Messidor and EyePACS. Simulation results demonstrate that the approach effectively transfers knowledge from a complex model to a lighter one. Furthermore, experimental results illustrate that the performance of different small models is improved significantly by using unlabeled data and knowledge distillation.
(Copyright © 2021 Elsevier B.V. All rights reserved.)
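
For readers who want a concrete picture of the technique the abstract describes, the sketch below shows the general pattern of distilling a large teacher network into a smaller student using only unlabeled images. It is a minimal illustration under stated assumptions, not the authors' exact method: the teacher and student models, the unlabeled_loader, and hyperparameters such as the temperature T are introduced here purely for the example.

```python
# Minimal sketch of knowledge distillation on unlabeled data (NOT the
# paper's exact method). `teacher`, `student`, `unlabeled_loader`, and
# the temperature T are illustrative assumptions.
import torch
import torch.nn.functional as F

def distill_on_unlabeled(teacher, student, unlabeled_loader,
                         epochs=10, T=4.0, lr=1e-3, device="cpu"):
    """Train `student` to match the frozen `teacher`'s soft predictions.

    No ground-truth labels are used: the loss is the KL divergence
    between the temperature-softened class distributions of the two models.
    """
    teacher.to(device).eval()      # teacher stays frozen throughout
    student.to(device).train()
    opt = torch.optim.Adam(student.parameters(), lr=lr)

    for _ in range(epochs):
        for images in unlabeled_loader:    # batches of images only, no labels
            images = images.to(device)
            with torch.no_grad():
                t_logits = teacher(images) # soft targets from the teacher
            s_logits = student(images)
            # KL divergence between temperature-softened distributions;
            # kl_div expects log-probabilities for the student input.
            loss = F.kl_div(
                F.log_softmax(s_logits / T, dim=1),
                F.softmax(t_logits / T, dim=1),
                reduction="batchmean",
            ) * (T * T)
            opt.zero_grad()
            loss.backward()
            opt.step()
    return student
```

The T*T scaling follows Hinton et al.'s original distillation formulation and keeps gradient magnitudes comparable across temperatures; because no labels are available, the KL term is the entire loss rather than being mixed with a supervised cross-entropy term.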

Details

Language:
English
ISSN:
1873-2860
Volume:
121
Database:
MEDLINE
Journal:
Artificial intelligence in medicine
Publication Type:
Academic Journal
Accession number:
34763798
Full Text:
https://doi.org/10.1016/j.artmed.2021.102176