OPTIMAL AUXILIARY PRIORS AND REVERSIBLE JUMP PROPOSALS FOR A CLASS OF VARIABLE DIMENSION MODELS

Published online by Cambridge University Press:  27 April 2020

Andriy Norets*
Affiliation: Department of Economics, Brown University, Providence, RI 02912
Address correspondence to Andriy Norets; e-mail: andriy_norets@brown.edu.

Abstract

This article develops a Markov chain Monte Carlo (MCMC) method for a class of models that encompasses finite and countable mixtures of densities and mixtures of experts with a variable number of mixture components. The method is shown to maximize the expected probability of acceptance for cross-dimensional moves and to minimize the asymptotic variance of sample average estimators under certain restrictions. The method can be represented as a retrospective sampling algorithm with an optimal choice of auxiliary priors and as a reversible jump algorithm with optimal proposal distributions. The method is primarily motivated by and applied to a Bayesian nonparametric model for conditional densities based on mixtures of a variable number of experts. The mixture-of-experts model outperforms standard parametric and nonparametric alternatives in out-of-sample performance comparisons in an application to Engel curve estimation. The proposed MCMC algorithm makes estimation of this model practical.
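To give a concrete, if simplified, sense of the cross-dimensional moves the abstract refers to, the sketch below implements a generic reversible jump birth/death sampler for a toy equal-weight Gaussian mixture with an unknown number of components. It follows the textbook construction in which a new component mean is proposed from its prior, so the prior and proposal densities cancel and the Jacobian equals one; it does not implement the article's optimal auxiliary priors or optimal proposal distributions. The simulated data, the uniform prior on the number of components, the known component variance, and all tuning constants are illustrative assumptions.

```python
# Minimal sketch: reversible jump birth/death MCMC for a toy equal-weight
# Gaussian mixture with an unknown number of components. Generic construction
# only -- NOT the article's optimized proposals. All priors, data, and tuning
# constants below are illustrative assumptions.
import numpy as np
from scipy.special import logsumexp

rng = np.random.default_rng(0)

# Toy data and fixed hyperparameters (assumptions)
y = np.concatenate([rng.normal(-2, 1, 150), rng.normal(3, 1, 150)])
sigma = 1.0   # known component standard deviation
tau = 5.0     # prior std of component means, mu_j ~ N(0, tau^2)
k_max = 10    # prior on the number of components: uniform on {1, ..., k_max}

def log_lik(mu):
    """Log-likelihood of an equal-weight Gaussian mixture with means mu."""
    k = len(mu)
    comp = (-0.5 * ((y[:, None] - mu[None, :]) / sigma) ** 2
            - 0.5 * np.log(2 * np.pi * sigma ** 2))
    return np.sum(logsumexp(comp, axis=1) - np.log(k))

def log_prior_mu(m):
    return -0.5 * (m / tau) ** 2 - 0.5 * np.log(2 * np.pi * tau ** 2)

mu = np.array([0.0])   # start with a single component
ll = log_lik(mu)
draws_k = []

for it in range(5000):
    # Within-model move: random-walk Metropolis on the component means
    prop = mu + rng.normal(0, 0.2, size=len(mu))
    ll_prop = log_lik(prop)
    log_a = ll_prop - ll + np.sum(log_prior_mu(prop) - log_prior_mu(mu))
    if np.log(rng.uniform()) < log_a:
        mu, ll = prop, ll_prop

    # Cross-dimensional move: birth or death, each proposed with probability 1/2
    k = len(mu)
    if rng.uniform() < 0.5:
        if k < k_max:                       # birth: draw a new mean from its prior
            mu_new = rng.normal(0, tau)
            pos = rng.integers(k + 1)       # uniform insertion position, matching
            prop = np.insert(mu, pos, mu_new)  # the uniform deletion in the death move
            ll_prop = log_lik(prop)
            # Proposal from the prior => prior and proposal densities cancel;
            # with a uniform prior on k and symmetric move probabilities the
            # acceptance ratio reduces to the likelihood ratio (Jacobian = 1).
            if np.log(rng.uniform()) < ll_prop - ll:
                mu, ll = prop, ll_prop
    else:
        if k > 1:                           # death: delete a uniformly chosen component
            pos = rng.integers(k)
            prop = np.delete(mu, pos)
            ll_prop = log_lik(prop)
            if np.log(rng.uniform()) < ll_prop - ll:
                mu, ll = prop, ll_prop

    draws_k.append(len(mu))

print("posterior frequencies of k:",
      {k: draws_k.count(k) for k in sorted(set(draws_k))})
```

Because the new parameter is drawn from its prior and the birth/death probabilities are symmetric, the cross-dimensional acceptance ratio collapses to a likelihood ratio; such naive proposals typically yield low acceptance rates in richer models, which is precisely the inefficiency the article's optimal auxiliary priors and proposal distributions are designed to address.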

Type: ARTICLES
Copyright: © Cambridge University Press 2020

Footnotes

*

I thank participants of the MCMCSki conferences and of seminars at the University of Pennsylvania, Chicago, Princeton, and Yale for helpful discussions. I thank the editors and anonymous referees for comments that helped to improve the manuscript. Support from NSF Award SES-1851796 is gratefully acknowledged.
