DAE-CNN: Exploiting and disentangling contrast agent effects for breast lesions classification in DCE-MRI
- Author
-
Michela Gravina, Mario Sansone, Stefano Marrone, Carlo Sansone
- Subjects
Computer science, Contrast (statistics), Pattern recognition, Autoencoder, Convolutional neural network, Image (mathematics), Artificial intelligence, Signal processing, Medical imaging, Key (cryptography), Computer vision and pattern recognition, Software, Generator (mathematics)
- Abstract
Convolutional Neural Networks (CNNs) are opening up unprecedented scenarios in fields where designing effective features is tedious even for domain experts. This is the case in medical imaging, i.e. procedures that acquire images of the interior of the human body for clinical purposes. Despite these promising results, we argue that a naive use of CNNs may not be effective, since “medical images are more than pictures”. A notable example is breast Dynamic Contrast Enhanced-Magnetic Resonance Imaging (DCE-MRI), in which the kinetics of the injected Contrast Agent (CA) is crucial for lesion classification. Therefore, in this work we introduce a new GAN-like approach designed to simultaneously learn to disentangle the CA effects from all the other image components while performing the lesion classification: the generator is an intrinsic Deforming Autoencoder (DAE), while the discriminator is a CNN. We compared the performance of the proposed approach against several literature proposals (both classical and CNN-based) using patient-wise cross-validation. Finally, for the sake of completeness, we also analyzed the impact of variations in some key aspects of the proposed solution. Results not only show the effectiveness of our approach (+8% AUC w.r.t. the runner-up) but also confirm that all of the approach’s components effectively contribute to the solution.
- Published
- 2021