Comparison of CT and Dixon MR Abdominal Adipose Tissue Quantification Using a Unified Computer-Assisted Software Framework
- Author
- Li-Yueh Hsu, Zara Ali, Hadi Bagheri, Fahimul Huda, Bernadette A. Redd, and Elizabeth C. Jones
- Subjects
- abdominal adipose tissue, computed tomography, magnetic resonance imaging, fat quantification, image segmentation, Computer applications to medicine. Medical informatics, R858-859.7
- Abstract
Purpose: Reliable and objective measures of abdominal fat distribution across imaging modalities are essential for various clinical and research scenarios, such as assessing cardiometabolic disease risk due to obesity. We aimed to compare quantitative measures of subcutaneous (SAT) and visceral (VAT) adipose tissues in the abdomen between computed tomography (CT) and Dixon-based magnetic resonance (MR) images using a unified computer-assisted software framework. Materials and Methods: This study included 21 subjects who underwent abdominal CT and Dixon MR imaging on the same day. For each subject, two matched axial CT and fat-only MR images at the L2-L3 and L4-L5 intervertebral levels were selected for fat quantification. For each image, outer and inner abdominal wall regions as well as SAT and VAT pixel masks were automatically generated by our software. The computer-generated results were then inspected and corrected by an expert reader. Results: There was excellent agreement in both abdominal wall segmentation and adipose tissue quantification between matched CT and MR images. Pearson coefficients were 0.97 for both outer and inner region segmentation, 0.99 for SAT quantification, and 0.97 for VAT quantification. Bland–Altman analyses indicated minimal bias in all comparisons. Conclusion: We showed that abdominal adipose tissue can be reliably quantified from both CT and Dixon MR images using a unified computer-assisted software framework. This flexible framework provides a simple workflow for measuring SAT and VAT from both modalities to support various clinical research applications.
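To make the agreement analysis concrete, the minimal Python sketch below (not the authors' software) illustrates how per-slice SAT or VAT areas could be computed from binary pixel masks and how paired CT and MR measurements could be compared with Pearson correlation and Bland–Altman statistics. The function names, pixel-spacing handling, and sample values are illustrative assumptions only.

```python
import numpy as np
from scipy import stats

def adipose_area_cm2(mask, pixel_spacing_mm):
    """Area of a binary adipose-tissue mask (e.g., SAT or VAT) in cm^2.

    mask             : 2D boolean/0-1 array for one axial slice
    pixel_spacing_mm : (row_spacing, col_spacing) in millimetres
    """
    pixel_area_mm2 = pixel_spacing_mm[0] * pixel_spacing_mm[1]
    return mask.sum() * pixel_area_mm2 / 100.0  # mm^2 -> cm^2

def agreement(ct_values, mr_values):
    """Pearson correlation plus Bland-Altman bias and 95% limits of
    agreement between paired CT and MR measurements (one value per slice)."""
    ct = np.asarray(ct_values, dtype=float)
    mr = np.asarray(mr_values, dtype=float)
    r, _ = stats.pearsonr(ct, mr)
    diff = ct - mr
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    return r, bias, (bias - half_width, bias + half_width)

# Hypothetical paired SAT areas (cm^2) for matched CT and MR slices.
ct_sat = [180.2, 210.5, 150.3, 240.8]
mr_sat = [178.9, 212.1, 149.7, 243.0]
r, bias, limits = agreement(ct_sat, mr_sat)
print(f"Pearson r = {r:.3f}, bias = {bias:.2f} cm^2, LoA = {limits}")
```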
- Published
- 2023