Learning Gaussian Mixture Models With Entropy-Based Criteria
- Source :
- IEEE Transactions on Neural Networks. 20:1756-1771
- Publication Year :
- 2009
- Publisher :
- Institute of Electrical and Electronics Engineers (IEEE), 2009.
Abstract
- In this paper, we address the problem of estimating the parameters of Gaussian mixture models. Although the expectation-maximization (EM) algorithm yields the maximum-likelihood (ML) solution, its sensitivity to the choice of starting parameters is well known, and it may converge to the boundary of the parameter space. Furthermore, the resulting mixture depends on the number of selected components, and the optimal number of kernels may be unknown beforehand. We introduce the entropy of the probability density function (pdf) associated with each kernel as a measure of the quality of a given mixture model with a fixed number of kernels. We propose two methods to approximate the entropy of each kernel, together with a modification of the classical EM algorithm that finds the optimal number of mixture components. Moreover, we use two stopping criteria: a novel global criterion based on mixture entropy, called the Gaussianity deficiency (GD), and one based on the minimum description length (MDL) principle. Our algorithm, called entropy-based EM (EBEM), starts with a single kernel and performs only splits, selecting the worst kernel according to the GD. We have successfully tested it on probability density estimation, pattern classification, and color image segmentation. Experimental results improve on those of other state-of-the-art model-order selection methods.
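The split-only scheme the abstract describes can be sketched in one dimension. The `gaussianity_deficiency` function below is a simplified stand-in for the paper's per-kernel measure: it takes the absolute gap between the maximum entropy a Gaussian with the cluster's variance could attain and a histogram estimate of the cluster's actual entropy, rather than the paper's entropy estimators or its normalized ratio. The split rule (offset the new means by one standard deviation), the stopping threshold, and the histogram bin count are all illustrative assumptions, not values from the paper.

```python
import numpy as np

def gaussian_pdf(x, mu, var):
    """Univariate Gaussian density."""
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

def em_step(x, w, mu, var):
    """One EM iteration for a 1-D Gaussian mixture."""
    # E-step: responsibility of each kernel for each point, shape (K, n)
    r = w[:, None] * gaussian_pdf(x[None, :], mu[:, None], var[:, None])
    r /= r.sum(axis=0, keepdims=True)
    # M-step: re-estimate weights, means, variances
    n_k = r.sum(axis=1) + 1e-12
    w = n_k / len(x)
    mu = (r * x).sum(axis=1) / n_k
    var = (r * (x[None, :] - mu[:, None]) ** 2).sum(axis=1) / n_k
    return w, mu, np.maximum(var, 1e-6)

def empirical_entropy(samples, bins=20):
    """Histogram plug-in estimate of differential entropy, in nats."""
    p, edges = np.histogram(samples, bins=bins, density=True)
    widths = np.diff(edges)
    mask = p > 0
    return -np.sum(p[mask] * np.log(p[mask]) * widths[mask])

def gaussianity_deficiency(x, w, mu, var):
    """Simplified per-kernel GD proxy: hard-assign each point to its most
    responsible kernel, then compare the entropy a Gaussian of the cluster's
    variance would have (the maximum possible) with a histogram estimate of
    the cluster's actual entropy.  A large gap means the kernel covers
    clearly non-Gaussian data and is a candidate for splitting."""
    r = w[:, None] * gaussian_pdf(x[None, :], mu[:, None], var[:, None])
    assign = np.argmax(r, axis=0)
    gd = np.zeros(len(w))
    for k in range(len(w)):
        pts = x[assign == k]
        if len(pts) < 10:          # too few points for a stable estimate
            continue
        h_max = 0.5 * np.log(2 * np.pi * np.e * max(pts.var(), 1e-6))
        gd[k] = max(0.0, h_max - empirical_entropy(pts))
    return gd

def ebem_sketch(x, max_k=5, n_iter=50, threshold=0.05):
    """Split-only EM in the spirit of EBEM: start from a single kernel and
    split the worst kernel (by the GD proxy) until no kernel exceeds a
    hypothetical threshold or max_k kernels are reached."""
    w, mu, var = np.array([1.0]), np.array([x.mean()]), np.array([x.var()])
    for _ in range(max_k - 1):
        for _ in range(n_iter):
            w, mu, var = em_step(x, w, mu, var)
        gd = gaussianity_deficiency(x, w, mu, var)
        k = int(np.argmax(gd))
        if gd[k] < threshold:      # hypothetical stopping threshold
            break
        # Split kernel k into two kernels offset by one standard deviation
        s = np.sqrt(var[k])
        w = np.concatenate([np.delete(w, k), [w[k] / 2, w[k] / 2]])
        mu = np.concatenate([np.delete(mu, k), [mu[k] - s, mu[k] + s]])
        var = np.concatenate([np.delete(var, k), [var[k], var[k]]])
    for _ in range(n_iter):
        w, mu, var = em_step(x, w, mu, var)
    return w, mu, var
```

On well-separated bimodal data, the single initial kernel shows a large entropy gap (a Gaussian fitted over two modes has far more variance, hence more maximum entropy, than the data actually exhibit), so it is split; once each kernel covers one roughly Gaussian cluster, the gap shrinks below the threshold and splitting stops.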
- Subjects :
- Computer Networks and Communications
Entropy
Normal Distribution
Entropy estimation
Mixture theory
Artificial Intelligence
Expectation–maximization algorithm
Applied mathematics
Entropy (information theory)
Gaussian process
Mathematics
Models, Statistical
Pattern recognition
General Medicine
Density estimation
Mixture model
Computer Science Applications
Variable kernel density estimation
Data Interpretation, Statistical
Neural Networks, Computer
Artificial intelligence
Algorithms
Software
Details
- ISSN :
- 1941-0093 and 1045-9227
- Volume :
- 20
- Database :
- OpenAIRE
- Journal :
- IEEE Transactions on Neural Networks
- Accession number :
- edsair.doi.dedup.....527b7d3e855be4393e11f0388005b584