
Consistency of sequence classification with entropic priors.

Authors :
Palmieri, Francesco A. N.
Ciuonzo, Domenico
Source :
AIP Conference Proceedings. 5/3/2012, Vol. 1443 Issue 1, p338-345. 8p. 1 Graph.
Publication Year :
2012

Abstract

Entropic priors, recently revisited within the context of theoretical physics, were originally introduced for image processing and for general statistical inference. Entropic priors seem to represent a very promising approach to objective prior determination when prior information is not otherwise available. Attention has mostly been limited to continuous parameter spaces; our focus in this work is on applying the entropic prior idea to Bayesian inference with discrete classes in signal processing problems. Unfortunately, it is well known that entropic priors, when applied to sequences, may lead to excessive spreading of the entropy as the number of samples grows. In this paper we show that the spreading of the entropy may be tolerated as long as the posterior probabilities remain consistent. We derive a condition for posterior consistency based on conditional entropies and KL divergences, using the Asymptotic Equipartition Property (AEP). Furthermore, we show that entropic priors can be modified to enforce posterior consistency by adding a constraint to the joint entropy maximization. Simulations of entropic priors applied to a coin-flipping experiment are included. [ABSTRACT FROM AUTHOR]
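The entropic-prior idea for discrete classes mentioned in the abstract can be illustrated numerically. The snippet below is a minimal sketch, not the paper's implementation: it assumes the standard entropic-prior form for a discrete class variable, pi(c) proportional to exp(H(X|c)), which maximizes the joint entropy H(C, X) for fixed class-conditional likelihoods, and applies it to i.i.d. coin flips so that H(X^N|c) = N * H(X|c). The coin biases, sample size, and variable names are illustrative choices, not taken from the paper.

```python
import numpy as np

def entropic_prior(cond_entropies):
    """Entropic prior over discrete classes: pi(c) proportional to exp(H(X|c)).

    This maximizes the joint entropy H(C, X) = H(C) + sum_c pi(c) * H(X|c)
    over the prior pi, for fixed class-conditional likelihoods.
    """
    w = np.exp(cond_entropies - np.max(cond_entropies))  # shift for numerical stability
    return w / w.sum()

def bernoulli_entropy(p):
    """Entropy (in nats) of a single flip of a coin with heads-probability p."""
    q = 1.0 - p
    return -(p * np.log(p) + q * np.log(q))

# Illustrative setup: each class c is a coin with a different bias (hypothetical values).
biases = np.array([0.2, 0.5, 0.8])

N = 50                                    # number of observed flips (illustrative)
H_seq = N * bernoulli_entropy(biases)     # i.i.d. flips: H(X^N | c) = N * H(X | c)
prior = entropic_prior(H_seq)             # note the exponential dependence on N

# Posterior after observing a sequence generated by the first (biased) coin.
rng = np.random.default_rng(0)
flips = rng.random(N) < biases[0]
k = flips.sum()                           # number of heads observed
log_lik = k * np.log(biases) + (N - k) * np.log(1.0 - biases)
log_post = np.log(prior) + log_lik
post = np.exp(log_post - np.max(log_post))
post /= post.sum()

# Both the log-prior gap, N * (H(0.5) - H(p)), and the expected log-likelihood
# advantage of the true coin, N * KL(Bern(p) || Bern(0.5)), grow linearly in N;
# comparing such growth rates is, loosely, the flavour of the consistency
# question the abstract raises.
print("entropic prior :", np.round(prior, 4))
print("posterior      :", np.round(post, 4))
```

As a usage note, increasing N in this sketch shows how the prior's dependence on the sequence length can become comparable to the information carried by the data, which is the regime in which a posterior-consistency condition of the kind described in the abstract becomes relevant.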

Details

Language :
English
ISSN :
0094-243X
Volume :
1443
Issue :
1
Database :
Academic Search Index
Journal :
AIP Conference Proceedings
Publication Type :
Conference
Accession number :
74978559
Full Text :
https://doi.org/10.1063/1.3703652