
Multi-level Kronecker Convolutional Neural Network (ML-KCNN) for Glioma Segmentation from Multi-modal MRI Volumetric Data

Authors :
Muhammad Junaid Ali
Basit Raza
Ahmad Raza Shahid
Source :
J Digit Imaging
Publication Year :
2020

Abstract

The development of an automated glioma segmentation system from MRI volumes is a difficult task because of the data imbalance problem. The ability of deep learning models to incorporate different layers for data representation assists medical experts such as radiologists in recognizing the condition of the patient and makes medical practice easier and more automated. State-of-the-art deep learning algorithms have enabled advances in medical image segmentation, such as segmenting volumes into sub-tumor classes. For this task, fully convolutional network (FCN)-based architectures are used to build end-to-end segmentation solutions. In this paper, we propose a multi-level Kronecker convolutional neural network (ML-KCNN) that captures information at different levels to obtain both local and global contextual information. Our ML-KCNN uses Kronecker convolution, which overcomes the missing-pixels problem caused by dilated convolution. Moreover, we use a post-processing technique to minimize false positives in the segmented outputs, and the generalized dice loss (GDL) function handles the data imbalance problem. Furthermore, the combination of connected component analysis (CCA) with conditional random fields (CRF) as a post-processing step achieves a reduced Hausdorff distance (HD) of 3.76 on enhancing tumor (ET), 4.88 on whole tumor (WT), and 5.85 on tumor core (TC), with a Dice similarity coefficient (DSC) of 0.74 on ET, 0.90 on WT, and 0.83 on TC. Qualitative and visual evaluation shows that the proposed segmentation method achieves performance competitive with other brain tumor segmentation techniques.
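
As a rough illustration of the loss and evaluation metric cited above, the sketch below computes a generalized Dice loss and the Dice similarity coefficient in Python/NumPy. It assumes the GDL follows the common formulation with inverse-squared class-volume weights; function and variable names are illustrative and not taken from the paper.

```python
import numpy as np

def generalized_dice_loss(probs, onehot, eps=1e-7):
    """Generalized Dice loss for class-imbalanced segmentation.

    probs  : (N, C) predicted class probabilities per voxel
    onehot : (N, C) one-hot ground-truth labels per voxel
    """
    # Inverse squared class volume: down-weights the dominant background
    # and up-weights small tumor sub-regions (ET, TC).
    w = 1.0 / (onehot.sum(axis=0) ** 2 + eps)
    intersect = (probs * onehot).sum(axis=0)
    denom = (probs + onehot).sum(axis=0)
    return 1.0 - 2.0 * (w * intersect).sum() / ((w * denom).sum() + eps)

def dice_similarity(pred_mask, true_mask, eps=1e-7):
    """Dice similarity coefficient (DSC) between two binary masks."""
    pred = pred_mask.astype(bool)
    true = true_mask.astype(bool)
    return 2.0 * np.logical_and(pred, true).sum() / (pred.sum() + true.sum() + eps)
```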

Details

ISSN :
1618-727X
Volume :
34
Issue :
4
Database :
OpenAIRE
Journal :
Journal of Digital Imaging
Accession number :
edsair.doi.dedup.....1c1bb07a035731795f06cc1b881b65c6