1. Robust Bayesian model selection for heavy-tailed linear regression using finite mixtures
- Authors
- Marcos O. Prates, Victor H. Lachos, and Flávio B. Gonçalves
- Subjects
FOS: Computer and information sciences, Statistics and Probability, Statistics - Methodology (stat.ME), Bayesian inference, Markov chain Monte Carlo (MCMC), Gibbs sampling, model selection, posterior probability, linear regression, linear model, Student-t, slash, scale mixtures of normal, penalised complexity priors, applied mathematics
- Abstract
In this paper, we present a novel methodology for Bayesian model selection in linear models with heavy-tailed distributions. We consider a finite mixture of distributions to model a latent variable, where each component of the mixture corresponds to one possible model within the symmetrical class of normal independent distributions; the Gaussian model is naturally one of the possibilities. This allows for a simultaneous analysis based on the posterior probability of each model. Inference is performed via Markov chain Monte Carlo: a Gibbs sampler with Metropolis-Hastings steps for a subset of the parameters. Simulated examples highlight the advantages of this approach compared to a segregated analysis based on arbitrarily chosen model selection criteria. Examples with real data are presented, and an extension to censored linear regression is introduced and discussed.
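To illustrate the data-augmentation idea underlying the normal independent (scale mixtures of normal) class used in the paper, here is a minimal sketch, not the authors' implementation, of a Gibbs sampler for one member of that class: Student-t linear regression, where the t error is represented as a normal with a gamma-distributed latent scale. The priors (flat on beta, Jeffreys on sigma2), the fixed degrees of freedom, and all simulation settings are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate heavy-tailed regression data (Student-t errors, nu = 3).
n, nu = 200, 3.0
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([1.0, 2.0])
y = X @ beta_true + rng.standard_t(nu, size=n)

def gibbs_t(y, X, nu, iters=2000, burn=500):
    """Gibbs sampler for t-regression via the scale-mixture representation:
    e_i | u_i ~ N(0, sigma2 / u_i),  u_i ~ Gamma(nu/2, rate = nu/2)."""
    n, p = X.shape
    beta, sigma2, u = np.zeros(p), 1.0, np.ones(n)
    draws = []
    for it in range(iters):
        # beta | u, sigma2, y: weighted least squares with weights u_i / sigma2.
        w = u / sigma2
        cov = np.linalg.inv(X.T @ (X * w[:, None]))
        mean = cov @ (X.T @ (w * y))
        beta = rng.multivariate_normal(mean, cov)
        resid = y - X @ beta
        # sigma2 | beta, u: inverse-gamma under a Jeffreys prior (assumption).
        sigma2 = 1.0 / rng.gamma(n / 2.0, 2.0 / np.sum(u * resid**2))
        # u_i | beta, sigma2: conjugate gamma update from the t representation.
        u = rng.gamma((nu + 1.0) / 2.0, 2.0 / (nu + resid**2 / sigma2))
        if it >= burn:
            draws.append(beta)
    return np.array(draws)

draws = gibbs_t(y, X, nu)
print(draws.mean(axis=0))  # posterior means, close to beta_true
```

In the paper's full scheme, a latent model indicator additionally selects among the mixture components (Gaussian, Student-t, slash, etc.), so posterior model probabilities come out of the same MCMC run; the sketch above shows only the within-model augmentation step for one component.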
- Published
- 2020