
On Lower Bounds for Nonstandard Deterministic Estimation.

Authors :
Kbayer, Nabil
Galy, Jerome
Chaumette, Eric
Vincent, Francois
Renaux, Alexandre
Larzabal, Pascal
Source :
IEEE Transactions on Signal Processing. Mar 2017, Vol. 65, Issue 6, p1538-1553. 16p.
Publication Year :
2017

Abstract

We consider deterministic parameter estimation in the situation where the probability density function (p.d.f.) parameterized by unknown deterministic parameters results from the marginalization of a joint p.d.f. that also depends on random variables. Unfortunately, in the general case this marginalization is mathematically intractable, which prevents the use of the known standard deterministic lower bounds (LBs) on the mean squared error (MSE). The general case can, however, be tackled by embedding the initial observation space in a hybrid one, where any standard LB can be transformed into a modified one fitted to nonstandard deterministic estimation, albeit at the expense of tightness. Furthermore, these modified LBs (MLBs) appear to include the submatrix of hybrid LBs that is an LB for the deterministic parameters. Moreover, since in nonstandard estimation maximum likelihood estimators (MLEs) can no longer be derived, suboptimal nonstandard MLEs (NSMLEs) are proposed as a substitute. We show that any standard LB on the MSE of MLEs has a nonstandard version lower bounding the MSE of NSMLEs. We provide an analysis of the relative performance of the NSMLEs, as well as a comparison with the MLBs, for a large class of estimation problems. Last, the general approach introduced is exemplified with, among other things, a new look at the well-known Gaussian complex observation models. [ABSTRACT FROM PUBLISHER]
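
A minimal sketch of the setting the abstract describes, in notation assumed here rather than taken from the paper: let $\mathbf{x}$ denote the observation, $\boldsymbol{\theta}$ the unknown deterministic parameters, and $\mathbf{x}_r$ the random variables to be marginalized out. The observed-data p.d.f. is then

\[
p(\mathbf{x};\boldsymbol{\theta}) = \int p(\mathbf{x},\mathbf{x}_r;\boldsymbol{\theta})\,d\mathbf{x}_r .
\]

When this integral is intractable, standard LBs on the MSE matrix $\mathbb{E}\big[(\widehat{\boldsymbol{\theta}}-\boldsymbol{\theta})(\widehat{\boldsymbol{\theta}}-\boldsymbol{\theta})^{T}\big]$ cannot be evaluated from $p(\mathbf{x};\boldsymbol{\theta})$ directly. Working in the hybrid space of $(\mathbf{x},\mathbf{x}_r)$ sidesteps the marginalization; for instance, a joint maximization of the form

\[
(\widehat{\boldsymbol{\theta}},\widehat{\mathbf{x}}_r) = \arg\max_{\boldsymbol{\theta},\mathbf{x}_r} p(\mathbf{x},\mathbf{x}_r;\boldsymbol{\theta})
\]

is one natural candidate for the suboptimal NSMLE the abstract refers to (an assumption on our part; the paper's exact definition may differ).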

Details

Language :
English
ISSN :
1053-587X
Volume :
65
Issue :
6
Database :
Academic Search Index
Journal :
IEEE Transactions on Signal Processing
Publication Type :
Academic Journal
Accession Number :
124145974
Full Text :
https://doi.org/10.1109/TSP.2016.2645538