
Monte Carlo fusion

Published online by Cambridge University Press:  12 July 2019

Hongsheng Dai, University of Essex
Murray Pollock, University of Warwick
Gareth Roberts, University of Warwick

Abstract

In this paper we propose a new theory and methodology to tackle the problem of unifying Monte Carlo samples from distributed densities into a single Monte Carlo draw from the target density. This surprisingly challenging problem arises in many settings (for instance, expert elicitation, multiview learning, and distributed ‘big data’ problems), but the framework and methodology proposed in this paper (Monte Carlo fusion) constitute, to date, the first general approach that avoids any form of approximation error in obtaining the unified inference. In this paper we focus on the key theoretical underpinnings of this new methodology, and on simple (direct) Monte Carlo interpretations of the theory. There is considerable scope to tailor the theory introduced in this paper to particular application settings (such as the big data setting), construct efficient parallelised schemes, understand the approximation and computational efficiencies of other such unification paradigms, and explore new theoretical and methodological directions.
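For intuition only (this is not the Monte Carlo fusion algorithm of the paper): the fusion problem takes sub-densities f_1, …, f_C, each typically represented by its own Monte Carlo sample, and seeks draws from the pooled target f(x) ∝ ∏_c f_c(x). In the special case where every f_c is Gaussian, the pooled target is itself Gaussian and available in closed form, which makes a useful sanity check for any fusion scheme. A minimal sketch under that Gaussian assumption (the function name is illustrative):

```python
import numpy as np

def gaussian_product(means, variances):
    """Parameters of the pooled density f(x) ∝ ∏_c N(x; mu_c, sigma_c^2).

    Precisions (inverse variances) add, and the pooled mean is the
    precision-weighted average of the sub-density means.
    """
    means = np.asarray(means, dtype=float)
    precisions = 1.0 / np.asarray(variances, dtype=float)
    pooled_var = 1.0 / precisions.sum()
    pooled_mean = pooled_var * (precisions * means).sum()
    return pooled_mean, pooled_var

# Two sub-densities N(0, 1) and N(2, 1): the fused target is N(1, 0.5).
mu, var = gaussian_product([0.0, 2.0], [1.0, 1.0])
```

Outside the Gaussian case no such closed form exists, which is what makes exact (approximation-free) unification of sub-samples, as developed in this paper, a genuinely hard problem.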

Type
Research Papers
Copyright
© Applied Probability Trust 2019 



