Bayesian Estimation of Dynamic Discrete Choice Models
- Author
Imai, Susumu; Jain, Neelam; and Ching, Andrew
- Subjects
Monte Carlo method -- Analysis; Algorithms -- Analysis; Markov processes -- Analysis; Algorithm; Business; Economics; Mathematics
- Abstract
To access the full text of this article, visit: http://dx.doi.org/10.3982/ECTA5658

Byline: Susumu Imai (*), Neelam Jain (**), Andrew Ching (***)

Keywords: Bayesian estimation; dynamic programming; discrete choice models; Markov chain Monte Carlo

Abstract: We propose a new methodology for structural estimation of infinite horizon dynamic discrete choice models. We combine the dynamic programming (DP) solution algorithm with the Bayesian Markov chain Monte Carlo algorithm into a single algorithm that solves the DP problem and estimates the parameters simultaneously. As a result, the computational burden of estimating a dynamic model becomes comparable to that of a static model. Another feature of our algorithm is that even though the number of grid points on the state variable is small per solution-estimation iteration, the number of effective grid points increases with the number of estimation iterations. This helps ease the 'curse of dimensionality.' We simulate and estimate several versions of a simple model of entry and exit to illustrate our methodology. We also prove that under standard conditions, the parameters converge in probability to the true posterior distribution, regardless of the starting values.

Author Affiliation: (*) Dept. of Economics, Queen's University, 233 Dunning Hall, 94 University Avenue, Kingston, ON K7L 5M2, Canada; imais@econ.queensu.ca (**) Dept. of Economics, City University London, Northampton Square, London EC1V 0HB, U.K., and Dept. of Economics, Northern Illinois University, 508 Zulauf Hall, DeKalb, IL 60115, U.S.A.; njain@niu.edu (***) Rotman School of Management, University of Toronto, 105 St. George Street, Toronto, ON M5S 3E6, Canada; andrew.ching@rotman.utoronto.ca

Article History: Manuscript received January 2005; final revision received May 2009.
- Published
2009
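The core idea described in the abstract, interleaving a single Bellman update of the DP solution with each MCMC estimation step and reusing past iterations as approximate solution points, can be sketched on a deliberately stripped-down stay/exit toy. This is a minimal illustrative sketch, not the authors' actual model or code: the log-sum-exp payoffs, discount factor, kernel bandwidth, proposal scale, and true parameter value below are all assumptions chosen for the toy.

```python
import numpy as np

rng = np.random.default_rng(0)
beta = 0.9          # discount factor (assumed for the toy)
theta_true = -1.0   # true flow payoff of staying (assumed for the toy)

def solve_V(theta, iters=200):
    """Full fixed-point solve of the toy Bellman equation.

    Used only to generate data; the estimator below never does this.
    With type-1 extreme-value shocks: V = log(exp(0) + exp(theta + beta*V)).
    """
    V = 0.0
    for _ in range(iters):
        V = np.log(1.0 + np.exp(theta + beta * V))
    return V

# Simulate stay (True) / exit (False) decisions at the true parameter.
V_true = solve_V(theta_true)
p_stay = 1.0 / (1.0 + np.exp(-(theta_true + beta * V_true)))
data = rng.random(500) < p_stay

def log_lik(theta, V):
    """Logit choice likelihood given a (possibly approximate) value V."""
    p = 1.0 / (1.0 + np.exp(-(theta + beta * V)))
    return np.sum(np.where(data, np.log(p), np.log(1.0 - p)))

# IJC-style trick: store past (theta_j, V_j) pairs and form the expected
# value at a new theta by kernel-weighting nearby stored iterates, so each
# estimation step needs only ONE Bellman update, not a full DP solve.
history = []

def approx_V(theta, bw=0.05, window=200):
    """Kernel-weighted average of recent stored value iterates near theta."""
    if not history:
        return 0.0
    recent = history[-window:]
    th = np.array([h[0] for h in recent])
    Vs = np.array([h[1] for h in recent])
    w = np.exp(-0.5 * ((th - theta) / bw) ** 2)
    return float(np.dot(w, Vs) / (w.sum() + 1e-12))

theta = 0.0
draws = []
for it in range(4000):
    cand = theta + 0.1 * rng.standard_normal()      # random-walk proposal
    # One Bellman update per parameter, seeded by past iterations' values.
    V_cur = np.log(1.0 + np.exp(theta + beta * approx_V(theta)))
    V_cand = np.log(1.0 + np.exp(cand + beta * approx_V(cand)))
    history.append((theta, V_cur))
    history.append((cand, V_cand))
    # Flat prior, so the Metropolis-Hastings ratio is the likelihood ratio.
    if np.log(rng.random()) < log_lik(cand, V_cand) - log_lik(theta, V_cur):
        theta = cand
    draws.append(theta)

post = np.array(draws[2000:])                        # discard burn-in
print(f"posterior mean of theta: {post.mean():.2f} (true {theta_true})")
```

The point of the sketch is the cost structure: each MCMC iteration performs a constant amount of DP work (one Bellman update and a kernel lookup), while the stored `(theta, V)` history plays the role of the growing set of effective solution points that the abstract credits with easing the curse of dimensionality.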