
Interactive 3D U-net for the segmentation of the pancreas in computed tomography scans

Authors:
T.G.W. Boers
F. van der Heijden
Dean C. Barratt
Henkjan J. Huisman
Eli Gibson
Jasenko Krdzalic
J.J. Hermans
Yipeng Hu
Ester Bonmati
Affiliations:
Digital Society Institute
Robotics and Mechatronics
Source:
Physics in Medicine and Biology, 65(6):065002. Institute of Physics (IOP)
Publication Year:
2020

Abstract

The increasing incidence of pancreatic cancer will make it the second deadliest cancer in 2030. Imaging-based early diagnosis and image-guided treatment are emerging potential solutions. Artificial intelligence (AI) can help provide and improve widespread diagnostic expertise and accurate interventional image interpretation. Accurate segmentation of the pancreas is essential both to create annotated data sets for training AI and for computer-assisted interventional guidance. Automated deep learning segmentation performance in pancreas computed tomography (CT) imaging is low due to poor grey-value contrast and complex anatomy. A recent interactive deep learning segmentation framework for brain CT, which strongly improved an initial automated segmentation with minimal user input, seemed a good solution, but it yielded no satisfactory results for pancreas CT, possibly due to a sub-optimal neural network architecture. We hypothesize that a state-of-the-art U-net architecture is better suited, because it can produce a better initial segmentation and can likely be extended to a similar interactive approach. We implemented the existing interactive method, iFCN, and developed an interactive version of U-net that we call iUnet. iUnet is fully trained to produce the best possible initial segmentation; in interactive mode, a partial set of its layers is additionally trained on user-generated scribbles. We compared the initial segmentation performance of iFCN and iUnet on a 100-CT dataset using Dice similarity coefficient analysis. Second, we assessed the performance gain of interactive use with three observers, measuring segmentation quality and time. Average automated baseline performance was 78% (iUnet) versus 72% (FCN). Manual and semi-automatic segmentation performance was 87% in 15 min for manual segmentation and 86% in 8 min for iUnet. We conclude that iUnet provides a better baseline than iFCN and can reach expert manual performance significantly faster than manual segmentation in the case of pancreas CT. Our novel iUnet architecture is modality- and organ-agnostic and is a potential solution for semi-automatic medical image segmentation in general.
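
To make the two mechanisms named in the abstract concrete, the sketch below illustrates (1) the Dice similarity coefficient used to score a segmentation against a reference and (2) interactive fine-tuning of only a partial set of layers on sparse user scribbles. This is a minimal PyTorch sketch under stated assumptions, not the authors' implementation: the names (TinyUNet, interactive_update), the toy tensor shapes, and the choice to freeze the encoder while updating the decoder are all hypothetical stand-ins for the full 3D U-net described in the paper.

# Minimal sketch (not the paper's code) of Dice scoring and scribble-based
# partial-layer fine-tuning; all names and shapes here are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F


def dice_coefficient(pred: torch.Tensor, target: torch.Tensor, eps: float = 1e-6) -> float:
    """Dice similarity coefficient between binary masks: DSC = 2|A ∩ B| / (|A| + |B|)."""
    pred = pred.float().flatten()
    target = target.float().flatten()
    intersection = (pred * target).sum()
    return float((2.0 * intersection + eps) / (pred.sum() + target.sum() + eps))


class TinyUNet(nn.Module):
    """Toy 3D encoder-decoder standing in for the full 3D U-net."""

    def __init__(self, in_ch: int = 1, out_ch: int = 2):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv3d(in_ch, 8, 3, padding=1), nn.ReLU(),
            nn.Conv3d(8, 16, 3, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.Conv3d(16, 8, 3, padding=1), nn.ReLU(),
            nn.Conv3d(8, out_ch, 1),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))


def interactive_update(model: TinyUNet, volume: torch.Tensor,
                       scribble_labels: torch.Tensor, steps: int = 10) -> TinyUNet:
    """Fine-tune only the decoder on sparse user scribbles.

    scribble_labels holds a class index per voxel, with -1 for voxels the user
    did not scribble on; the loss is computed only on annotated voxels.
    """
    for p in model.encoder.parameters():      # freeze encoder: partial-layer training
        p.requires_grad = False
    opt = torch.optim.Adam(model.decoder.parameters(), lr=1e-4)
    for _ in range(steps):
        logits = model(volume)                # (N, C, D, H, W)
        loss = F.cross_entropy(logits, scribble_labels, ignore_index=-1)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return model


if __name__ == "__main__":
    net = TinyUNet()
    ct = torch.randn(1, 1, 16, 32, 32)                        # toy CT volume
    scribbles = torch.full((1, 16, 32, 32), -1, dtype=torch.long)
    scribbles[0, 8, 10:20, 10:20] = 1                         # a few "pancreas" scribbles
    scribbles[0, 8, 0:5, 0:5] = 0                             # a few background scribbles
    interactive_update(net, ct, scribbles, steps=2)
    pred = net(ct).argmax(dim=1)
    annotated = scribbles >= 0
    print("Dice on scribbled voxels:",
          dice_coefficient(pred[annotated], scribbles[annotated]))

In the paper's interactive setting, the scribbles come from an observer correcting the initial automated result, and retraining a subset of layers keeps the per-correction update fast; freezing the encoder is one plausible way to realise that, not necessarily the authors' exact choice.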

Details

ISSN:
0031-9155
Volume:
65
Database:
OpenAIRE
Journal:
Physics in Medicine and Biology
Accession number:
edsair.doi.dedup.....9ff7c309ab07bee6c81c6da1a46f46a2