
Automatic Segmentation of MRI of Brain Tumor Using Deep Convolutional Network.

Authors :
Zhou, Runwei
Hu, Shijun
Ma, Baoxiang
Ma, Bangcheng
Source :
BioMed Research International; 6/15/2022, p1-9, 9p
Publication Year :
2022

Abstract

Computer-aided diagnosis and treatment based on multimodal magnetic resonance imaging (MRI) brain tumor image segmentation has long been a significant topic in medical image processing. Multimodal MRI brain tumor segmentation exploits the characteristics of each modality in the MRI image to delineate the whole tumor, tumor core, and enhancing tumor regions and to distinguish them from normal brain tissue. However, the grayscale similarity between brain tissues across MRI images is very high, making it difficult to segment multimodal MRI brain tumor images with traditional algorithms. We therefore employ deep learning to make full use of the complementary feature information between the modalities and carry out the following work: (i) build a network model suited to brain tumor segmentation tasks based on a fully convolutional neural network framework and (ii) adopt an end-to-end training method that uses two-dimensional slices of MRI images as network input. The class imbalance present in brain tumor image data is overcome by introducing the Dice loss function to compute the network training loss; in addition, a parallel Dice loss is proposed to further improve substructure segmentation. We propose a cascaded network model based on a fully convolutional neural network that improves segmentation accuracy for the tumor core and enhancing tumor regions and achieves good substructure segmentation results on the BraTS 2017 data set. [ABSTRACT FROM AUTHOR]
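The Dice loss referred to in the abstract measures overlap between a predicted segmentation and the ground-truth mask, which makes it robust to the class imbalance between small tumor regions and large background areas. Below is a minimal illustrative sketch of the standard soft Dice loss in NumPy; the paper's exact formulation, including its proposed parallel Dice variant, may differ in details such as smoothing terms and per-class weighting.

```python
import numpy as np

def dice_loss(pred, target, eps=1e-6):
    """Soft Dice loss: 1 - 2|P ∩ T| / (|P| + |T|).

    pred   : array of predicted foreground probabilities in [0, 1]
    target : binary ground-truth mask of the same shape
    eps    : small smoothing constant to avoid division by zero
    """
    intersection = np.sum(pred * target)
    denominator = np.sum(pred) + np.sum(target)
    return 1.0 - (2.0 * intersection + eps) / (denominator + eps)

# Perfect overlap drives the loss toward 0; disjoint masks toward 1.
mask = np.array([[1.0, 1.0], [0.0, 0.0]])
print(dice_loss(mask, mask))      # ≈ 0.0
print(dice_loss(mask, 1 - mask))  # ≈ 1.0
```

Because the loss is normalized by the total foreground size rather than the number of pixels, a small tumor core contributes as much to the gradient as a large background region, which is the imbalance-handling property the abstract relies on.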

Details

Language :
English
ISSN :
23146133
Database :
Complementary Index
Journal :
BioMed Research International
Publication Type :
Academic Journal
Accession number :
157456364
Full Text :
https://doi.org/10.1155/2022/4247631