
A measure of the information content of EIT data.

Authors:
Adler A
Youmaran R
Lionheart WR
Source:
Physiological measurement [Physiol Meas] 2008 Jun; Vol. 29 (6), pp. S101-9. Date of Electronic Publication: 2008 Jun 10.
Publication Year:
2008

Abstract

We ask: how many bits of information (in the Shannon sense) do we get from a set of EIT measurements? Here, the term information in measurements (IM) is defined as the decrease in uncertainty about the contents of a medium due to a set of measurements. This decrease in uncertainty is quantified by the change from the inter-class model, q, defined by the prior information, to the intra-class model, p, given by the measured data (corrupted by noise). IM is measured by the expected relative entropy (Kullback-Leibler divergence) between distributions q and p, and corresponds to the channel capacity in an analogous communications system. Based on a Gaussian model of the measurement noise, Σ_n, and a prior model of the image element covariances, Σ_x, we calculate IM = (1/2) Σ_i log₂(SNR_i + 1), where SNR_i is the signal-to-noise ratio of each independent measurement, calculated from the prior and noise models. As an example, we consider saline tank measurements from a 16-electrode EIT system with a 2 cm radius non-conductive target, and calculate IM = 179 bits. Temporal sequences of frames are considered, and formulae for IM as a function of temporal image element correlations are derived. We suggest that this measure may allow novel insights into questions such as distinguishability limits, optimal measurement schemes and data fusion.
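The abstract's formula is straightforward to evaluate numerically. The sketch below is an illustration, not the paper's code: it assumes the per-measurement SNRs can be taken as the eigenvalues of the noise-whitened signal covariance, with the signal covariance in measurement space formed as J Σ_x Jᵀ from an assumed sensitivity matrix J; the function name, the toy dimensions, and the white prior/noise models are all hypothetical choices for the example.

```python
import numpy as np

def information_in_measurements(Sigma_s, Sigma_n):
    """Evaluate IM = 1/2 * sum_i log2(SNR_i + 1), in bits.

    Sigma_s: (m, m) covariance of the noise-free measurements
             (here assumed to be J @ Sigma_x @ J.T).
    Sigma_n: (m, m) covariance of the measurement noise.
    """
    # Whiten by the noise: with Sigma_n = L L^T, the eigenvalues of
    # L^{-1} Sigma_s L^{-T} are the SNRs of the independent
    # (decorrelated) measurement channels.
    L = np.linalg.cholesky(Sigma_n)
    A = np.linalg.solve(L, Sigma_s)                   # L^{-1} Sigma_s
    W = np.linalg.solve(L, A.T)                       # L^{-1} Sigma_s L^{-T}
    snr = np.clip(np.linalg.eigvalsh(W), 0.0, None)   # guard tiny negatives
    return 0.5 * np.sum(np.log2(1.0 + snr))

# Toy example: 208 measurements (a common count for one frame of a
# 16-electrode adjacent-drive system), a random sensitivity matrix,
# a white image prior and white measurement noise.
rng = np.random.default_rng(0)
m, n_elem = 208, 64
J = rng.standard_normal((m, n_elem))
Sigma_x = np.eye(n_elem)        # prior image-element covariance
Sigma_n = 1e-2 * np.eye(m)      # measurement-noise covariance
Sigma_s = J @ Sigma_x @ J.T
print(f"IM = {information_in_measurements(Sigma_s, Sigma_n):.1f} bits")
```

One property worth noting from the sum form: channels with SNR_i ≪ 1 contribute almost nothing to IM, which is what makes the measure a plausible tool for comparing measurement schemes, as the authors suggest.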

Details

Language:
English
ISSN:
0967-3334
Volume:
29
Issue:
6
Database:
MEDLINE
Journal:
Physiological measurement
Publication Type:
Academic Journal
Accession Number:
18544803
Full Text:
https://doi.org/10.1088/0967-3334/29/6/S09