IGU-Aug: Information-guided unsupervised augmentation and pixel-wise contrastive learning for medical image analysis.
- Source :
- IEEE transactions on medical imaging [IEEE Trans Med Imaging] 2024 Aug 01; Vol. PP. Date of Electronic Publication: 2024 Aug 01.
- Publication Year :
- 2024
- Publisher :
- Ahead of Print
Abstract
- Contrastive learning (CL) is a form of self-supervised learning and has been widely used for various tasks. Unlike the widely studied instance-level contrastive learning, pixel-wise contrastive learning mainly helps with pixel-wise dense prediction tasks. The counterpart to an instance in instance-level CL is a pixel, along with its neighboring context, in pixel-wise CL. Aiming to build better feature representations, there is a vast literature on designing instance augmentation strategies for instance-level CL, but little comparable work on pixel augmentation for pixel-wise CL at pixel granularity. In this paper, we attempt to bridge this gap. We first classify a pixel into three categories, namely low-, medium-, and high-informative, based on the information quantity the pixel contains. We then adaptively design separate augmentation strategies for each category in terms of augmentation intensity and sampling ratio. Extensive experiments validate that our information-guided pixel augmentation strategy succeeds in encoding more discriminative representations and surpassing other competitive approaches in unsupervised local feature matching. Furthermore, our pretrained model improves the performance of both one-shot and fully supervised models. To the best of our knowledge, we are the first to propose a pixel augmentation method with pixel granularity for enhancing unsupervised pixel-wise contrastive learning. Code is available at https://github.com/Curli-quan/IGU-Aug.
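- The abstract's core idea, classifying each pixel by the information it carries and then varying augmentation intensity and sampling ratio per category, can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes local Shannon entropy as the information measure, and the thresholds, noise levels, and sampling ratios are hypothetical placeholder values.

```python
import numpy as np

def local_entropy(img, patch=5):
    """Shannon entropy of the grayscale histogram in a patch around each pixel."""
    h, w = img.shape
    pad = patch // 2
    padded = np.pad(img, pad, mode="reflect")
    ent = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            window = padded[i:i + patch, j:j + patch]
            counts = np.bincount(window.ravel(), minlength=256)
            p = counts / counts.sum()
            p = p[p > 0]
            ent[i, j] = -(p * np.log2(p)).sum()
    return ent

def classify_pixels(img, t_low=1.0, t_high=3.0):
    """Label each pixel 0 (low-), 1 (medium-), or 2 (high-informative)
    by thresholding its local entropy. Thresholds are illustrative."""
    ent = local_entropy(img)
    labels = np.ones_like(ent, dtype=np.int64)  # default: medium
    labels[ent < t_low] = 0
    labels[ent >= t_high] = 2
    return labels

# Per-category augmentation policy (hypothetical values): low-informative
# pixels tolerate stronger augmentation and are sampled less often, while
# high-informative pixels get gentler augmentation and a larger sampling ratio.
AUG_POLICY = {
    0: {"noise_std": 0.20, "sample_ratio": 0.1},
    1: {"noise_std": 0.10, "sample_ratio": 0.3},
    2: {"noise_std": 0.02, "sample_ratio": 0.6},
}
```

A flat image region yields near-zero entropy and falls in the low-informative class, while textured anatomy-like regions land in the high-informative class; the policy table then selects how aggressively each pixel's context is perturbed before contrastive training.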
Details
- Language :
- English
- ISSN :
- 1558-254X
- Volume :
- PP
- Database :
- MEDLINE
- Journal :
- IEEE transactions on medical imaging
- Publication Type :
- Academic Journal
- Accession number :
- 39088491
- Full Text :
- https://doi.org/10.1109/TMI.2024.3436713