
Reducing Algorithm Complexity for Computing an Aggregate Uncertainty Measure.

Authors :
Liu, Chunsheng
Grenier, Dominic
Jousselme, Anne-Laure
Bossé, Éloi
Source :
IEEE Transactions on Systems, Man & Cybernetics: Part A. Sep2007, Vol. 37 Issue 5, p669-679. 11p. 4 Black and White Photographs, 3 Graphs.
Publication Year :
2007

Abstract

In the theory of evidence, two kinds of uncertainty coexist: nonspecificity and discord. An aggregate uncertainty (AU) measure has been defined to capture both kinds of uncertainty in a single quantity. Meyerowitz et al. proposed an algorithm for calculating AU and validated its practical usage. Although this algorithm was proven correct by Klir and Wierman, in some cases it remains too complex. In fact, when the cardinality of the frame of discernment is very large, computing AU can become intractable. Therefore, building on the seminal work of Klir and Harmanec, we give some justifications for restricting the computation of AU(Bel) to the core of the corresponding belief function, and we also propose an algorithm to calculate AU(Bel), the F-algorithm, which reduces the computational complexity of the original algorithm of Meyerowitz et al. We prove that this algorithm gives the same results as Meyerowitz's algorithm, and we outline conditions under which it reduces the computational complexity significantly. Moreover, we illustrate the use of the F-algorithm in computing AU in a practical scenario of target identification. [ABSTRACT FROM AUTHOR]
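To make the complexity issue concrete, the following is a minimal sketch of the standard max-entropy AU algorithm in the Harmanec-Klir style that the abstract refers to (not the paper's F-algorithm, whose details are not given here): repeatedly pick the nonempty subset A of the remaining frame that maximizes Bel(A)/|A|, spread that belief mass uniformly over A, remove A, and finally take the Shannon entropy of the resulting distribution. The exhaustive subset search in each pass is the exponential step whose cost motivates restricting attention to the core. The function name and the representation of Bel as a Python callable are illustrative assumptions.

```python
from itertools import combinations
from math import log2

def aggregate_uncertainty(bel, frame):
    """Max-entropy AU sketch (Harmanec-Klir style algorithm).

    bel  -- callable mapping a frozenset of elements to its belief value,
            with bel(frozenset()) == 0 (representation assumed here)
    frame -- iterable of the elements of the frame of discernment
    Returns AU(Bel) in bits.
    """
    p = {}                         # max-entropy probability assignment
    assigned = frozenset()         # elements already given probability
    remaining = frozenset(frame)
    while remaining:
        base = bel(assigned)
        best, best_ratio = None, -1.0
        # Exhaustive search over all nonempty subsets of the remaining
        # frame -- this is the exponential step. Iterating r upward with
        # ">=" breaks ties toward the larger-cardinality maximizer.
        for r in range(1, len(remaining) + 1):
            for combo in combinations(sorted(remaining), r):
                a = frozenset(combo)
                # Belief restricted to the remaining frame:
                # Bel'(A) = Bel(A ∪ assigned) - Bel(assigned)
                ratio = (bel(a | assigned) - base) / len(a)
                if ratio >= best_ratio:
                    best, best_ratio = a, ratio
        if best_ratio <= 0:
            break                  # no belief mass left to distribute
        for x in best:             # spread the mass uniformly over A
            p[x] = best_ratio
        assigned |= best
        remaining -= best
    return -sum(q * log2(q) for q in p.values() if q > 0)
```

For the vacuous belief function on an n-element frame (total ignorance), the loop selects the whole frame in one pass and AU comes out as log2(n), the maximum possible value, as expected.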

Details

Language :
English
ISSN :
1083-4427
Volume :
37
Issue :
5
Database :
Academic Search Index
Journal :
IEEE Transactions on Systems, Man & Cybernetics: Part A
Publication Type :
Academic Journal
Accession number :
26405167
Full Text :
https://doi.org/10.1109/TSMCA.2007.893457