1. HaN-Seg: The head and neck organ-at-risk CT and MR segmentation challenge.
- Authors
- Podobnik G, Ibragimov B, Tappeiner E, Lee C, Kim JS, Mesbah Z, Modzelewski R, Ma Y, Yang F, Rudecki M, Wodziński M, Peterlin P, Strojan P, and Vrtovec T
- Subjects
- Humans, Tomography, X-Ray Computed methods, Magnetic Resonance Imaging methods, Head and Neck Neoplasms diagnostic imaging, Head and Neck Neoplasms radiotherapy, Organs at Risk radiation effects, Radiotherapy Planning, Computer-Assisted methods
- Abstract
Background and Purpose: To promote the development of auto-segmentation methods for head and neck (HaN) radiation treatment (RT) planning that exploit the information of computed tomography (CT) and magnetic resonance (MR) imaging modalities, we organized HaN-Seg: The Head and Neck Organ-at-Risk CT and MR Segmentation Challenge.
Materials and Methods: The challenge task was to automatically segment 30 organs-at-risk (OARs) of the HaN region in 14 withheld test cases, given 42 publicly available training cases. Each case consisted of one contrast-enhanced CT and one T1-weighted MR image of the HaN region of the same patient, with up to 30 corresponding reference OAR delineation masks. Performance was evaluated in terms of the Dice similarity coefficient (DSC) and the 95th-percentile Hausdorff distance (HD95), and statistical ranking was applied for each metric by pairwise comparison of the submitted methods using the Wilcoxon signed-rank test.
Results: While 23 teams registered for the challenge, only seven submitted their methods for the final phase. The top-performing team achieved a DSC of 76.9 % and an HD95 of 3.5 mm. All participating teams utilized architectures based on U-Net, with the winning team leveraging rigid MR-to-CT registration combined with network entry-level concatenation of both modalities.
Conclusion: This challenge simulated a real-world clinical scenario by providing non-registered MR and CT images with varying fields-of-view and voxel sizes. Remarkably, the top-performing teams achieved segmentation performance surpassing the inter-observer agreement on the same dataset. These results set a benchmark for future research on this publicly available dataset and on paired multi-modal image segmentation in general.
Competing Interests: The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
(Copyright © 2024 The Authors. Published by Elsevier B.V. All rights reserved.)
- Published
- 2024
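The abstract's two evaluation metrics, DSC and HD95, are standard for segmentation benchmarking. The sketch below is not the challenge's official evaluation code; it is a minimal NumPy/SciPy illustration of both metrics on toy 2D binary masks, where the function names, the surface-distance formulation via distance transforms, and the example masks are all assumptions for illustration:

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def dice(a, b):
    """Dice similarity coefficient between two binary masks."""
    a, b = a.astype(bool), b.astype(bool)
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

def hd95(a, b, spacing=(1.0, 1.0)):
    """95th-percentile Hausdorff distance (illustrative formulation).

    Uses Euclidean distance transforms: for every foreground voxel of one
    mask, the distance to the nearest foreground voxel of the other mask,
    pooled symmetrically, then the 95th percentile is taken.
    """
    a, b = a.astype(bool), b.astype(bool)
    # distance_transform_edt(~m) gives, at each voxel, the distance to the
    # nearest foreground voxel of mask m (scaled by the voxel spacing)
    dist_to_a = distance_transform_edt(~a, sampling=spacing)
    dist_to_b = distance_transform_edt(~b, sampling=spacing)
    d_ab = dist_to_b[a]  # distances from A's voxels to B
    d_ba = dist_to_a[b]  # distances from B's voxels to A
    return np.percentile(np.hstack([d_ab, d_ba]), 95)

# Toy example: two overlapping 6x6 squares shifted by one voxel
pred = np.zeros((10, 10), dtype=bool); pred[2:8, 2:8] = True
ref = np.zeros((10, 10), dtype=bool); ref[3:9, 3:9] = True
print(f"DSC = {dice(pred, ref):.3f}, HD95 = {hd95(pred, ref):.1f}")
```

In the challenge itself these metrics were computed per OAR in 3D with real voxel spacings and then fed into pairwise Wilcoxon signed-rank comparisons for ranking; the sketch only shows the per-mask metric definitions.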