A novel MM algorithm and the mode-sharing method in Bayesian computation for the analysis of general incomplete categorical data
- Source :
- Computational Statistics & Data Analysis. 140:122-143
- Publication Year :
- 2019
- Publisher :
- Elsevier BV, 2019.
Abstract
- Incomplete categorical data often arise in fields such as biomedicine, epidemiology, psychology, and sports. In this paper, we first introduce a novel minorization–maximization (MM) algorithm to calculate the maximum likelihood estimates (MLEs) of parameters and the posterior modes for the analysis of general incomplete categorical data. Although the data augmentation (DA) algorithm and Gibbs sampling, as the stochastic counterparts of the expectation–maximization (EM) and ECM algorithms, are well developed, little work has so far been done on creating stochastic versions of existing MM algorithms. This is the first paper to propose a mode-sharing method in Bayesian computation for general incomplete categorical data by developing a new acceptance–rejection (AR) algorithm aided by the proposed MM algorithm. The key idea is to construct a class of envelope densities indexed by a working parameter and to identify a specific envelope density that overcomes the four drawbacks of the traditional AR algorithm. The proposed mode-sharing AR algorithm has three significant characteristics: (I) it automatically establishes a family of envelope densities {g_λ(·) : λ ∈ S_λ} indexed by a working parameter λ, where each member of the family shares its mode with the posterior density; (II) with a one-dimensional grid search over the finite interval S_λ, it identifies an optimal working parameter λ_opt by maximizing the theoretical acceptance probability, yielding a best easy-to-sample envelope density g_{λ_opt}(·) that is more dispersive than the posterior density; (III) it obtains the optimal envelope constant c_opt by using the mode-sharing theorem (showing that the high-dimensional optimization can be completely avoided) or by applying the proposed MM algorithm again. Finally, a toy model and three real data sets are used to illustrate the proposed methodologies.
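The abstract's mode-sharing AR scheme can be illustrated with a minimal sketch. The target below is a Beta(3, 5) density standing in for a posterior, and the envelope family Beta(1 + λ(a−1), 1 + λ(b−1)) for λ ∈ (0, 1) is a hypothetical choice of mine (not the paper's construction) that happens to share the target's mode and flatten as λ shrinks; the one-dimensional grid search over λ and the AR acceptance step mirror the mechanics described in characteristics (I)–(III), but the specific densities, grid, and names are illustrative assumptions only.

```python
import math
import numpy as np

# Toy target "posterior": Beta(a, b). (Chosen for illustration; in a real
# problem the target would not be directly samplable.)
a, b = 3.0, 5.0
mode = (a - 1) / (a + b - 2)  # 1/3 -- shared by every envelope member below

def beta_pdf(x, p, q):
    """Normalized Beta(p, q) density at x."""
    const = math.gamma(p + q) / (math.gamma(p) * math.gamma(q))
    return const * x ** (p - 1) * (1 - x) ** (q - 1)

def envelope_params(lam):
    # Envelope family Beta(1 + lam*(a-1), 1 + lam*(b-1)): its mode is
    # lam*(a-1) / (lam*(a+b-2)) = mode, so every member shares the
    # target's mode; smaller lam gives a flatter (more dispersive) member.
    return 1 + lam * (a - 1), 1 + lam * (b - 1)

def envelope_constant(lam):
    # The ratio f/g_lam is proportional to x^{(a-1)(1-lam)}(1-x)^{(b-1)(1-lam)},
    # which peaks at the shared mode, so sup f/g_lam is attained there.
    p, q = envelope_params(lam)
    return beta_pdf(mode, a, b) / beta_pdf(mode, p, q)

# One-dimensional grid search for the working parameter maximizing the
# theoretical acceptance probability 1/c_lam (lam capped below 1 so the
# envelope stays strictly more dispersed than the target).
grid = np.linspace(0.1, 0.9, 81)
accept_prob = 1.0 / np.array([envelope_constant(l) for l in grid])
lam_opt = grid[np.argmax(accept_prob)]
c_opt = envelope_constant(lam_opt)

def ar_sample(n, lam, rng):
    """Plain acceptance-rejection draws from the target via the envelope."""
    p, q = envelope_params(lam)
    c = envelope_constant(lam)
    out = []
    while len(out) < n:
        x = rng.beta(p, q)                    # propose from the envelope
        if rng.uniform() * c * beta_pdf(x, p, q) <= beta_pdf(x, a, b):
            out.append(x)                     # accept with prob f/(c*g)
    return np.array(out)

rng = np.random.default_rng(0)
draws = ar_sample(5000, lam_opt, rng)
print(lam_opt, c_opt, draws.mean())  # mean of Beta(3, 5) is 3/8 = 0.375
```

Because sup f/g_λ sits at the shared mode, the envelope constant comes from a single density evaluation rather than a high-dimensional optimization, which is the practical point of the mode-sharing construction.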
- Subjects :
- Statistics and Probability
Applied Mathematics
Computation
Bayesian probability
Mode (statistics)
Interval (mathematics)
Computational Mathematics
Computational Theory and Mathematics
Categorical variable
Algorithm
Gibbs sampling
MM algorithm
Details
- ISSN :
- 0167-9473
- Volume :
- 140
- Database :
- OpenAIRE
- Journal :
- Computational Statistics & Data Analysis
- Accession number :
- edsair.doi...........cac65b0b64b768af107f6978ddfb9e2e
- Full Text :
- https://doi.org/10.1016/j.csda.2019.04.012