A note on the convergence rate of MCMC for robust Bayesian multivariate linear regression with proper priors.
- Author
- Backlund, Grant and Hobert, James P.
- Subjects
- REGRESSION analysis, GAUSSIAN integers, DENSITY, ALGORITHMS, MARKOV processes
- Abstract
- The multivariate linear regression model with errors from a scale mixture of Gaussian densities yields a complex likelihood function. Combining this likelihood with any nontrivial prior distribution leads to a highly intractable posterior density. If a conditionally conjugate prior is used, then there is a well-known and easy-to-implement data augmentation (DA) algorithm available for exploring the posterior. Hobert et al. recently showed that, under an improper conditionally conjugate prior (and weak regularity conditions), the Markov chain that drives the DA algorithm converges at a geometric rate. Unfortunately, the model studied by Hobert et al. can only be used in situations where the X matrix has full column rank. In this note, analogous convergence rate results are established for a proper conditionally conjugate prior. An important advantage of using a proper prior is that, not only is the X matrix allowed to be column rank deficient, but it can also have more columns than rows, that is, our model is applicable in cases where p > n. This is an important extension in the era of big data. [ABSTRACT FROM AUTHOR]
- Published
- 2020
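For a concrete feel of the kind of DA algorithm the abstract refers to, here is a minimal sketch. It is not the paper's multivariate algorithm: it assumes a univariate response, Student-t errors written as a scale mixture of Gaussians, a known error scale sigma2, and a proper N(0, tau2*I) prior on the coefficients (which is what keeps the sampler well defined when X is rank deficient or p > n). The function name `da_sampler` and the defaults for `nu`, `sigma2`, and `tau2` are illustrative choices, not from the paper.

```python
import numpy as np

def da_sampler(X, y, nu=5.0, sigma2=1.0, tau2=10.0, n_iter=5000, seed=0):
    """Two-block DA (Gibbs) sampler for linear regression with Student-t
    errors expressed as a scale mixture of Gaussians, under a proper
    N(0, tau2*I) prior on beta.  Illustrative sketch only."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    beta = np.zeros(p)
    draws = np.empty((n_iter, p))
    for t in range(n_iter):
        # Draw latent mixing weights: w_i | beta, y ~ Gamma((nu+1)/2, rate=(nu + r_i^2/sigma2)/2).
        r = y - X @ beta
        w = rng.gamma(shape=(nu + 1.0) / 2.0, scale=2.0 / (nu + r**2 / sigma2))
        # Draw coefficients: beta | w, y ~ N(m, V) with
        #   V = (X' diag(w) X / sigma2 + I/tau2)^{-1},  m = V X' diag(w) y / sigma2.
        # The proper-prior term I/tau2 keeps V well defined even when p > n.
        XtW = X.T * w                          # equals X' diag(w)
        V = np.linalg.inv(XtW @ X / sigma2 + np.eye(p) / tau2)
        m = V @ (XtW @ y) / sigma2
        beta = rng.multivariate_normal(m, V)
        draws[t] = beta
    return draws

# Usage: a tiny p > n example, which the proper prior handles without issue.
rng = np.random.default_rng(1)
n, p = 20, 30
X = rng.standard_normal((n, p))
y = X[:, 0] - 2.0 * X[:, 1] + rng.standard_t(df=5, size=n)
samples = da_sampler(X, y, n_iter=2000)
print(samples[1000:].mean(axis=0)[:2])        # posterior means of first two coefficients
```

The two conditional draws above are the standard I-step (latent scales) and P-step (coefficients) of a DA sampler for this class of models; the paper's contribution concerns how fast such a chain converges, not the mechanics of the draws themselves.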