
Quantitative Image Correction Using Semi- and Fully-automatic Segmentation of Hybrid Optoacoustic and Ultrasound Images

Authors :
Daniel Razansky
Xosé Luís Deán-Ben
Berkan Lafci
Elena Merčep
Source :
Biophotonics Congress: Biomedical Optics Congress 2018 (Microscopy/Translational/Brain/OTS).
Publication Year :
2018
Publisher :
OSA, 2018.

Abstract

Multispectral optoacoustic tomography (MSOT) is a fast-developing imaging modality that combines the high contrast of optical tissue excitation with the high resolution and penetration depth of ultrasound detection. Since light is subject to absorption and scattering when travelling through tissue, adequate knowledge of the spatial fluence distribution is required to ensure quantification accuracy in MSOT. To reduce systematic errors in spectral recovery caused by fluence and to provide a visually more homogeneous image, fluence correction is commonly performed on reconstructed images using state-of-the-art methods. These require as input information on the illumination geometry (known a priori from the system design) as well as a spatial reference of the object, either as a binary map (assuming uniform optical properties) or, in the more complex scenario of multiple regions with different optical properties, as a label map. Such maps are commonly generated by manual segmentation, delineating the outer border of the mouse body or the major organs present in the slice, which is a time-consuming, inefficient procedure prone to operator errors. Here we evaluate methods for semi- and fully-automatic segmentation of hybrid optoacoustic and ultrasound images and characterize their performance using quantitative metrics for medical image segmentation, with ground truth obtained by manual segmentation.
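As a rough illustration of how a segmentation mask feeds into fluence correction, the sketch below divides a reconstructed image by a crude exponential fluence estimate derived from each pixel's depth inside a binary body mask. The function name, the effective attenuation coefficient, and the Beer-Lambert-style decay model are illustrative assumptions only, not the correction methods referenced in the abstract.

```python
# Toy fluence correction sketch (not the method evaluated in the paper).
# Assumes: 2D reconstructed optoacoustic image `img`, binary body mask `mask`
# from segmentation, roughly uniform illumination around the body, and a single
# hypothetical effective attenuation coefficient `mu_eff_per_mm`.
import numpy as np
from scipy.ndimage import distance_transform_edt

def correct_fluence(img, mask, mu_eff_per_mm=0.5, pixel_size_mm=0.1, eps=1e-3):
    """Divide the image by a crude exponential fluence estimate inside the mask."""
    # Depth of each in-mask pixel from the tissue surface, in mm.
    depth_mm = distance_transform_edt(mask, sampling=pixel_size_mm)
    # Simple Beer-Lambert-like fluence decay with depth.
    fluence = np.exp(-mu_eff_per_mm * depth_mm)
    corrected = np.zeros_like(img, dtype=float)
    inside = mask.astype(bool)
    corrected[inside] = img[inside] / np.maximum(fluence[inside], eps)
    return corrected
```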

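The abstract does not name the quantitative metrics used to compare automatic and manual segmentations; the sketch below assumes two common overlap measures, the Dice and Jaccard coefficients, computed between a predicted binary mask and a manually drawn ground-truth mask.

```python
# Minimal sketch of common overlap metrics for evaluating a binary segmentation
# against a manual ground-truth mask. Dice and Jaccard are assumed here as
# typical choices; the abstract does not specify which metrics were used.
import numpy as np

def overlap_metrics(pred_mask, gt_mask):
    pred = pred_mask.astype(bool)
    gt = gt_mask.astype(bool)
    intersection = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    total = pred.sum() + gt.sum()
    dice = 2.0 * intersection / total if total else 1.0
    jaccard = intersection / union if union else 1.0
    return {"dice": dice, "jaccard": jaccard}
```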
Details

Database :
OpenAIRE
Journal :
Biophotonics Congress: Biomedical Optics Congress 2018 (Microscopy/Translational/Brain/OTS)
Accession number :
edsair.doi...........b1dd0b3bb47acc5377ad77f9684da38a
Full Text :
https://doi.org/10.1364/translational.2018.jth3a.44