
PFEMed: Few-shot medical image classification using prior guided feature enhancement.

Authors :
Dai, Zhiyong
Yi, Jianjun
Yan, Lei
Xu, Qingwen
Hu, Liang
Zhang, Qi
Li, Jiahui
Wang, Guoqiang
Source :
Pattern Recognition. Feb 2023, Vol. 134.
Publication Year :
2023

Abstract

Highlights :
• A novel dual-encoder architecture is introduced to extract feature representations.
• To our knowledge, we are the first to investigate a prior-guided VAE module for few-shot medical image classification.
• We present a novel method to initialize the priors estimated in the VAE module.
• The proposed approach will help the medical domain utilize knowledge from public datasets.

Deep learning-based methods have recently demonstrated outstanding performance on general image classification tasks. Because optimizing these methods depends on large amounts of labeled data, their application to medical image classification is limited. To address this issue, we propose PFEMed, a novel few-shot classification method for medical images. To extract both general and specific features from medical images, the method employs a dual-encoder structure: one encoder with fixed weights pre-trained on public image classification datasets, and another encoder trained on the target medical dataset. In addition, we introduce a novel prior-guided Variational Autoencoder (VAE) module to enhance the robustness of the target feature, which is the concatenation of the general and specific features. We then match the target features extracted from the support and query medical image samples and predict the category of each query sample. Extensive experiments on several publicly available medical image datasets show that our method outperforms current state-of-the-art few-shot methods by a wide margin, in particular outperforming MetaMed on the Pap smear dataset by over 2.63%. [ABSTRACT FROM AUTHOR]
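The following is a minimal PyTorch sketch of the dual-encoder and feature-matching idea described in the abstract. All class names, layer sizes, the ResNet-18 backbone, and the prototype-style matching rule are illustrative assumptions rather than the authors' implementation, and the paper's prior-initialization scheme for the VAE is not reproduced.

# Minimal sketch of the dual-encoder + matching idea from the abstract.
# Names, layer sizes, and the matching rule are assumptions, not the
# authors' code; the prior-guided initialization is not reproduced here.
import torch
import torch.nn as nn
from torchvision import models

class DualEncoder(nn.Module):
    def __init__(self, feat_dim=256):
        super().__init__()
        # General encoder: pre-trained on public image data, weights frozen.
        backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
        self.general = nn.Sequential(*list(backbone.children())[:-1])
        for p in self.general.parameters():
            p.requires_grad = False
        # Specific encoder: trained on the target medical dataset.
        self.specific = nn.Sequential(
            nn.Conv2d(3, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(128, feat_dim),
        )

    def forward(self, x):
        g = self.general(x).flatten(1)      # general features (frozen encoder)
        s = self.specific(x)                # dataset-specific features
        return torch.cat([g, s], dim=1)     # concatenated "target" feature

class FeatureVAE(nn.Module):
    # Plain VAE over the target feature, standing in for the paper's
    # prior-guided VAE enhancement module.
    def __init__(self, in_dim, z_dim=64):
        super().__init__()
        self.enc = nn.Linear(in_dim, 2 * z_dim)
        self.dec = nn.Linear(z_dim, in_dim)

    def forward(self, f):
        mu, logvar = self.enc(f).chunk(2, dim=1)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        return self.dec(z), mu, logvar

def classify_queries(support_feats, support_labels, query_feats):
    # Prototype-style matching: assign each query to the nearest class mean
    # of the support features (one plausible reading of "matching").
    classes = support_labels.unique()
    protos = torch.stack([support_feats[support_labels == c].mean(0)
                          for c in classes])
    dists = torch.cdist(query_feats, protos)   # (n_query, n_way)
    return classes[dists.argmin(dim=1)]        # predicted class labels

In a few-shot episode, support and query images would both pass through DualEncoder (with the target feature optionally reconstructed by FeatureVAE) before classify_queries assigns each query to its nearest class prototype.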

Details

Language :
English
ISSN :
00313203
Volume :
134
Database :
Academic Search Index
Journal :
Pattern Recognition
Publication Type :
Academic Journal
Accession number :
160172341
Full Text :
https://doi.org/10.1016/j.patcog.2022.109108