Fully Automatic Head and Neck Cancer Prognosis Prediction in PET/CT
- Source :
- 11th International Workshop on Multimodal Learning for Clinical Decision Support (ML-CDS 2021), held in conjunction with the 24th International Conference on Medical Image Computing and Computer-Assisted Intervention (MICCAI 2021), Oct 2021, Strasbourg, France, pp. 59-68, ⟨10.1007/978-3-030-89847-2_6⟩. Multimodal Learning for Clinical Decision Support, ISBN 978-3-030-89846-5 (ML-CDS@MICCAI).
- Publication Year :
- 2021
- Publisher :
- HAL CCSD, 2021.
- Abstract :
- Several recent PET/CT radiomics studies have shown promising results for the prediction of patient outcomes in Head and Neck (H&N) cancer. These studies, however, are most often conducted on relatively small cohorts (up to 300 patients) and rely on manually delineated tumors. Recently, deep learning has reached high performance in the automatic segmentation of H&N primary tumors in PET/CT. This automatic segmentation could be used to validate these studies on larger-scale cohorts while obviating the burden of manual delineation. We propose a complete PET/CT processing pipeline combining the automatic segmentation of primary tumors and prognosis prediction for patients with H&N cancer treated with radiotherapy and chemotherapy. Automatic contours of the primary Gross Tumor Volume (GTVt) are obtained from a 3D UNet. A radiomics pipeline that automatically predicts patient outcome (Disease-Free Survival, DFS) is compared when using either the automatically or the manually annotated contours. In addition, we extract deep features from the bottleneck layers of the 3D UNet to compare them with standard radiomics features (first- and second-order as well as shape features) and to test the performance gain when they are combined. The models are evaluated on the HECKTOR 2020 dataset, consisting of 239 H&N patients from five centers with PET, CT, GTVt contours and DFS data available. Using Hand-Crafted (HC) radiomics features extracted from manual GTVt achieved the best performance, with an average Concordance (C) index of 0.672. The fully automatic pipeline (including deep and HC features from automatic GTVt) achieved an average C index of 0.626, which is lower but relatively close to using manual GTVt (p-value = 0.20). This suggests that large-scale studies could be conducted using a fully automatic pipeline to further validate current state-of-the-art H&N radiomics. The code will be shared publicly for reproducibility. © 2021, Springer Nature Switzerland AG.
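The evaluation metric quoted in the abstract is the Concordance (C) index computed on Disease-Free Survival. The following is a minimal, illustrative sketch (not the authors' code) of how Harrell's C-index can be computed from censored survival data and model risk scores; all names and values are hypothetical, and ties in event times are ignored for simplicity.

```python
# Minimal sketch of Harrell's concordance index (C-index) for right-censored survival data.
# A pair of patients is "comparable" when the one with the shorter observed time had an event;
# the pair is "concordant" when the model assigns that patient a higher risk score.

def concordance_index(times, events, risk_scores):
    """times: observed DFS times; events: 1 if the event was observed, 0 if censored;
    risk_scores: model outputs, higher = higher predicted risk (shorter predicted survival)."""
    concordant, comparable = 0.0, 0.0
    n = len(times)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            # Comparable pair: patient i failed before patient j's observed time.
            if times[i] < times[j] and events[i] == 1:
                comparable += 1
                if risk_scores[i] > risk_scores[j]:
                    concordant += 1
                elif risk_scores[i] == risk_scores[j]:
                    concordant += 0.5  # tied risk scores count as half-concordant
    return concordant / comparable if comparable else 0.0

# Toy usage with three hypothetical patients (DFS times, event indicators, predicted risks):
print(concordance_index([12, 30, 45], [1, 1, 0], [0.9, 0.4, 0.2]))  # 1.0 (perfect ranking)
```

In the paper's setting, the risk scores would come from a survival model fitted on hand-crafted radiomics features and/or deep features pooled from the 3D UNet bottleneck; a C-index of 0.5 corresponds to random ranking and 1.0 to perfect ranking, which puts the reported values of 0.672 (manual GTVt, HC features) and 0.626 (fully automatic pipeline) in context.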
- Subjects :
- Prognosis prediction
PET-CT
Radiomics
business.industry
Computer science
Pipeline (computing)
medicine.medical_treatment
Concordance
Head and neck cancer
Deep learning
medicine.disease
3. Good health
030218 nuclear medicine & medical imaging
Radiation therapy
03 medical and health sciences
0302 clinical medicine
030220 oncology & carcinogenesis
Fully automatic
medicine
[SDV.IB]Life Sciences [q-bio]/Bioengineering
Distributed File System
Nuclear medicine
business
Subjects
Details
- Language :
- English
- ISBN :
- 978-3-030-89846-5
- Database :
- OpenAIRE
- Journal :
- 11th International Workshop on Multimodal Learning for Clinical Decision Support (ML-CDS 2021), held in conjunction with MICCAI 2021, Oct 2021, Strasbourg, France, pp. 59-68, ⟨10.1007/978-3-030-89847-2_6⟩
- Accession number :
- edsair.doi.dedup.....dc68bbd7313d9e6e402473892aac2c0c
- Full Text :
- https://doi.org/10.1007/978-3-030-89847-2_6