
Pretrained Deep 2.5D Models for Efficient Predictive Modeling from Retinal OCT

Authors:
Emre, Taha
Oghbaie, Marzieh
Chakravarty, Arunava
Rivail, Antoine
Riedl, Sophie
Mai, Julia
Scholl, Hendrik P. N.
Sivaprasad, Sobha
Rueckert, Daniel
Lotery, Andrew
Schmidt-Erfurth, Ursula
Bogunović, Hrvoje
Publication Year:
2023

Abstract

In the field of medical imaging, 3D deep learning models play a crucial role in building powerful predictive models of disease progression. However, the size of these models presents significant challenges, both in terms of computational resources and data requirements. Moreover, achieving high-quality pretraining of 3D models proves to be even more challenging. To address these issues, hybrid 2.5D approaches provide an effective solution for exploiting 3D volumetric data with 2D models. Combining 2D and 3D techniques offers a promising avenue for optimizing performance while minimizing memory requirements. In this paper, we explore 2.5D architectures based on combinations of convolutional neural networks (CNNs), long short-term memory (LSTM) networks, and Transformers. In addition, leveraging the benefits of recent non-contrastive pretraining approaches in 2D, we further enhance the performance and data efficiency of 2.5D techniques. We demonstrate the effectiveness of these architectures and the associated pretraining on the task of predicting progression to wet age-related macular degeneration (AMD) within a six-month period, on two large longitudinal OCT datasets.

Comment: Accepted at OMIA-X MICCAI'23 Workshop
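To make the 2.5D pattern described in the abstract concrete, the following is a minimal PyTorch sketch of one such hybrid: a shared 2D CNN encodes each OCT B-scan, and an LSTM aggregates the per-slice embeddings into a volume-level prediction. The ResNet-18 backbone, the bidirectional LSTM, and all dimensions are illustrative assumptions, not the authors' exact configuration; in the paper's setting the 2D encoder would additionally carry non-contrastively pretrained weights.

```python
import torch
import torch.nn as nn
from torchvision.models import resnet18


class CNNLSTM25D(nn.Module):
    """2.5D hybrid: a shared 2D CNN encodes each B-scan slice,
    and an LSTM aggregates slice embeddings into one volume-level logit.
    Architecture and sizes are illustrative, not the paper's exact setup."""

    def __init__(self, embed_dim=512, hidden_dim=256):
        super().__init__()
        # weights=None here; the paper's approach would load
        # non-contrastively pretrained 2D weights instead.
        backbone = resnet18(weights=None)
        backbone.fc = nn.Identity()  # keep the 512-d pooled features
        self.encoder = backbone
        self.lstm = nn.LSTM(embed_dim, hidden_dim,
                            batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden_dim, 1)  # binary progression logit

    def forward(self, volume):
        # volume: (batch, num_slices, 1, H, W) -- grayscale OCT B-scans
        b, s, c, h, w = volume.shape
        x = volume.view(b * s, c, h, w).expand(-1, 3, -1, -1)  # ResNet expects 3 channels
        feats = self.encoder(x).view(b, s, -1)  # (batch, num_slices, embed_dim)
        seq, _ = self.lstm(feats)
        return self.head(seq[:, -1])  # classify from the final time step


# Usage: score a batch of 2 volumes, each with 16 slices of 224x224 pixels
model = CNNLSTM25D()
logits = model(torch.randn(2, 16, 1, 224, 224))
print(logits.shape)  # torch.Size([2, 1])
```

The same slice-encoder output could instead feed a Transformer over the slice axis, which is the other aggregation variant the abstract mentions; the memory savings come from never running 3D convolutions over the full volume.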

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2307.13865
Document Type:
Working Paper
Full Text:
https://doi.org/10.1007/978-3-031-44013-7_14