
FetusMapV2: Enhanced Fetal Pose Estimation in 3D Ultrasound

Authors :
Chen, Chaoyu
Yang, Xin
Huang, Yuhao
Shi, Wenlong
Cao, Yan
Luo, Mingyuan
Hu, Xindi
Zhu, Lei
Yu, Lequan
Yue, Kejuan
Zhang, Yuanji
Xiong, Yi
Ni, Dong
Huang, Weijun
Publication Year :
2023

Abstract

Fetal pose estimation in 3D ultrasound (US) involves identifying a set of associated fetal anatomical landmarks. Its primary objective is to provide comprehensive information about the fetus through landmark connections, thus benefiting various critical applications, such as biometric measurements, plane localization, and fetal movement monitoring. However, accurately estimating the 3D fetal pose in a US volume presents several challenges, including poor image quality, limited GPU memory for tackling high-dimensional data, symmetrical or ambiguous anatomical structures, and considerable variations in fetal poses. In this study, we propose a novel 3D fetal pose estimation framework (called FetusMapV2) to overcome the above challenges. Our contribution is three-fold. First, we propose a heuristic scheme that explores the complementary network structure-unconstrained and activation-unreserved GPU memory management approaches, which can enlarge the input image resolution for better results under limited GPU memory. Second, we design a novel Pair Loss to mitigate confusion caused by symmetrical and similar anatomical structures. It separates the hidden classification task from the landmark localization task and thus progressively eases model learning. Last, we propose a shape priors-based self-supervised learning method that selects the relatively stable landmarks to refine the pose online. Extensive experiments and diverse applications on a large-scale fetal US dataset, including 1000 volumes with 22 landmarks per volume, demonstrate that our method outperforms other strong competitors.

Comment: 16 pages, 11 figures, accepted by Medical Image Analysis (2023)
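The abstract does not spell out how the Pair Loss is formulated, so the snippet below is only a minimal, hypothetical sketch of one way a pair-style penalty could discourage a landmark detector from confusing symmetric landmarks: it asks each landmark's 3D heatmap to respond more strongly at its own ground-truth location than at its symmetric counterpart's location, by a margin. The function name, the margin hinge, and the use of PyTorch heatmaps are all illustrative assumptions, not the authors' actual implementation.

```python
# Hypothetical illustration (not FetusMapV2's code): a margin-based pair penalty
# that discourages a landmark's heatmap from firing at its symmetric
# counterpart's ground-truth location, intended to be added to a standard
# localization loss. Landmark pairing and the margin value are assumptions.
import torch
import torch.nn.functional as F


def pair_margin_loss(heatmaps, gt_coords, sym_pairs, margin=0.5):
    """
    heatmaps:  (B, K, D, H, W) predicted 3D heatmaps for K landmarks.
    gt_coords: (B, K, 3) ground-truth voxel indices (d, h, w), dtype long.
    sym_pairs: list of (i, j) index pairs of symmetric / confusable landmarks.
    Returns a scalar penalty that is zero once each landmark's response at its
    own location exceeds its response at the paired location by `margin`.
    """
    batch = torch.arange(heatmaps.shape[0], device=heatmaps.device)
    loss = heatmaps.new_zeros(())
    for i, j in sym_pairs:
        for lm, rival in ((i, j), (j, i)):
            # Response of landmark `lm` at its own ground-truth voxel.
            own = heatmaps[batch, lm,
                           gt_coords[:, lm, 0], gt_coords[:, lm, 1], gt_coords[:, lm, 2]]
            # Response of the same heatmap at the rival (symmetric) landmark's voxel.
            confused = heatmaps[batch, lm,
                                gt_coords[:, rival, 0], gt_coords[:, rival, 1], gt_coords[:, rival, 2]]
            # Hinge: penalize when the gap (own - confused) falls below the margin.
            loss = loss + F.relu(margin - (own - confused)).mean()
    return loss / (2 * len(sym_pairs))
```

In this sketch the penalty would be combined with a conventional heatmap regression loss, so the margin term only shapes how confidently symmetric landmarks (e.g., left/right limb joints) are separated; the paper's actual Pair Loss and training setup may differ.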

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2310.19293
Document Type :
Working Paper