
10 - Parallel Belief Propagation in Factor Graphs

from Part Two - Supervised and Unsupervised Learning Algorithms

Published online by Cambridge University Press:  05 February 2012

Joseph Gonzalez (Carnegie Mellon University, Pittsburgh, PA, USA)
Yucheng Low (Carnegie Mellon University, Pittsburgh, PA, USA)
Carlos Guestrin (Carnegie Mellon University, Pittsburgh, PA, USA)
Ron Bekkerman (LinkedIn Corporation, Mountain View, California)
Mikhail Bilenko (Microsoft Research, Redmond, Washington)
John Langford (Yahoo! Research, New York)

Summary

Probabilistic graphical models are used in a wide range of machine learning applications. From reasoning about protein interactions (Jaimovich et al., 2006) to stereo vision (Sun, Shum, and Zheng, 2002), graphical models have facilitated the application of probabilistic methods to challenging machine learning problems. A core operation in probabilistic graphical models is inference: the process of computing the probability of an event given particular observations. Although inference is NP-complete in general, several popular approximate inference algorithms typically perform well in practice. Unfortunately, even these approximate inference algorithms are computationally intensive and can therefore benefit from parallelization. In this chapter, we parallelize loopy belief propagation (loopy BP for short), which is used in a wide range of ML applications (Jaimovich et al., 2006; Sun et al., 2002; Lan et al., 2006; Baron, Sarvotham, and Baraniuk, 2010; Singla and Domingos, 2008).
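To make the inference operation concrete, the following is a minimal sketch of synchronous sum-product loopy BP on a small pairwise Markov random field. The graph, the potentials, and the fixed iteration count are illustrative choices for this sketch, not taken from the chapter:

```python
import numpy as np

# Toy pairwise MRF: three binary variables in a single loop. `unary[i]` is the
# node potential for variable i; `pairwise[(i, j)]` is the (attractive) edge
# potential. All numbers here are illustrative.
unary = {0: np.array([0.7, 0.3]),
         1: np.array([0.5, 0.5]),
         2: np.array([0.2, 0.8])}
edges = [(0, 1), (1, 2), (0, 2)]
pairwise = {e: np.array([[1.0, 0.5],
                         [0.5, 1.0]]) for e in edges}

# Messages m[(i, j)] from variable i to variable j, initialized uniform.
msgs = {(i, j): np.ones(2) / 2 for i, j in edges}
msgs.update({(j, i): np.ones(2) / 2 for i, j in edges})

def neighbors(i):
    return [j for (a, j) in msgs if a == i]

for _ in range(50):  # synchronous fixed-point iteration
    new = {}
    for (i, j) in msgs:
        # Product of i's unary potential and all incoming messages except j's.
        belief = unary[i].copy()
        for k in neighbors(i):
            if k != j:
                belief *= msgs[(k, i)]
        psi = pairwise[(i, j)] if (i, j) in pairwise else pairwise[(j, i)].T
        m = belief @ psi          # sum-product: sum over x_i of belief * psi
        new[(i, j)] = m / m.sum() # normalize for numerical stability
    msgs = new

def marginal(i):
    """Approximate marginal of variable i: unary times all incoming messages."""
    b = unary[i].copy()
    for k in neighbors(i):
        b *= msgs[(k, i)]
    return b / b.sum()
```

After convergence, `marginal(i)` gives the BP estimate of each variable's posterior; with these potentials, variable 0 stays biased toward its first state and variable 2 toward its second.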

We begin by briefly reviewing the sequential BP algorithm as well as the necessary background in probabilistic graphical models. We then present a collection of parallel shared-memory BP algorithms that demonstrate the importance of scheduling in parallel BP. Next, we develop the Splash BP algorithm, which combines new scheduling ideas to address the limitations of existing sequential BP algorithms and achieve theoretically optimal parallel performance. Finally, we show how to implement loopy BP efficiently in the distributed parallel setting by addressing the challenges of distributed state and load balancing.
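The scheduling ideas referred to above can be illustrated with a toy residual-prioritized loop in the spirit of residual BP (Elidan et al., 2006, cited below): instead of sweeping vertices in a fixed order, a priority queue reprocesses first whichever vertex's inputs changed the most. The graph and the scalar "update" rule here are stand-ins for recomputing a vertex's outgoing BP messages, not the chapter's Splash algorithm:

```python
import heapq

# Toy graph and state: scalar values stand in for BP messages, so a vertex's
# residual is simply how much its last update changed its value.
values = {0: 1.0, 1: 0.0, 2: 0.0, 3: 0.0}
edges = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}

def update(v):
    """Illustrative update: mix v's value with its neighbors' average."""
    return 0.5 * values[v] + 0.5 * sum(values[u] for u in edges[v]) / len(edges[v])

tol = 1e-6
# heapq is a min-heap, so residuals are negated to pop the LARGEST first.
residuals = {v: 1.0 for v in values}
pq = [(-1.0, v) for v in values]
heapq.heapify(pq)

while pq:
    neg_r, v = heapq.heappop(pq)
    if -neg_r != residuals[v] or residuals[v] <= tol:
        continue                      # stale queue entry, or already converged
    new_val = update(v)
    r = abs(new_val - values[v])      # residual generated by this update
    values[v] = new_val
    residuals[v] = 0.0
    for u in edges[v]:                # u's inputs changed: raise its priority
        if r > residuals[u]:
            residuals[u] = r
            heapq.heappush(pq, (-r, u))
```

The key design point, which the chapter develops much further, is that the priority queue focuses computation where messages are still changing, rather than wasting sweeps on parts of the graph that have already converged.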

Type: Chapter
In: Scaling Up Machine Learning: Parallel and Distributed Approaches, pp. 190–216
Publisher: Cambridge University Press
Print publication year: 2011


References

Adve, S. V., and Gharachorloo, K. 1996. Shared Memory Consistency Models: A Tutorial. Computer, 29(12), 66–76.
Baron, D., Sarvotham, S., and Baraniuk, R. G. 2010. Bayesian Compressive Sensing via Belief Propagation. IEEE Transactions on Signal Processing, 58(1), 269–280.
Bertsekas, D. P., and Tsitsiklis, J. N. 1989. Parallel and Distributed Computation: Numerical Methods. Englewood Cliffs, NJ: Prentice-Hall.
Cooper, G. F. 1990. The Computational Complexity of Probabilistic Inference Using Bayesian Belief Networks. Artificial Intelligence, 42, 393–405.
Crupi, V. A., Das, S. K., and Pinotti, M. C. 1996. Parallel and Distributed Meldable Priority Queues Based on Binomial Heaps. In: International Conference on Parallel Processing, Vol. 1. IEEE Computer Society.
Darwiche, A., Dechter, R., Choi, A., Gogate, V., and Otten, L. 2008. UAI'08 Workshop: Evaluating and Disseminating Probabilistic Reasoning Systems. http://graphmod.ics.uci.edu/uai08/.
Dean, J., and Ghemawat, S. 2008. MapReduce: Simplified Data Processing on Large Clusters. Communications of the ACM, 51(1), 107–113.
Domingos, P., Kok, S., Lowd, D., Poon, H. F., Richardson, M., Singla, P., Sumner, M., and Wang, J. 2008. Markov Logic: A Unifying Language for Structural and Statistical Pattern Recognition. Page 3 of: SSPR.
Driscoll, J. R., Gabow, H. N., Shrairman, R., and Tarjan, R. E. 1988. Relaxed Heaps: An Alternative to Fibonacci Heaps with Applications to Parallel Computation. Communications of the ACM, 31, 1343–1354.
Elidan, G., McGraw, I., and Koller, D. 2006. Residual Belief Propagation: Informed Scheduling for Asynchronous Message Passing. In: UAI'06.
Gonzalez, J., Low, Y., Guestrin, C., and O'Hallaron, D. 2009a (July). Distributed Parallel Inference on Large Factor Graphs. In: UAI'09.
Gonzalez, J., Low, Y., and Guestrin, C. 2009b. Residual Splash for Optimally Parallelizing Belief Propagation. In: AISTATS'09.
Hendrickson, B., and Leland, R. 1994 (Oct). The Chaco User's Guide, Version 2.0. Technical Report SAND94-2692. Sandia National Labs, Albuquerque, NM.
Huang, J., Chavira, M., and Darwiche, A. 2006. Solving MAP Exactly by Searching on Compiled Arithmetic Circuits. In: AAAI'06.
Ihler, A. T., Fisher III, J. W., and Willsky, A. S. 2005. Loopy Belief Propagation: Convergence and Effects of Message Errors. Journal of Machine Learning Research, 6, 905–936.
Jaimovich, A., Elidan, G., Margalit, H., and Friedman, N. 2006. Towards an Integrated Protein-Protein Interaction Network: A Relational Markov Network Approach. Journal of Computational Biology, 13(2), 145–164.
Karypis, G., and Kumar, V. 1998. Multilevel k-way Partitioning Scheme for Irregular Graphs. Journal of Parallel and Distributed Computing, 48(1).
Koller, D., and Friedman, N. 2009. Probabilistic Graphical Models. Cambridge, MA: MIT Press.
Lan, X. Y., Roth, S., Huttenlocher, D. P., and Black, M. J. 2006. Efficient Belief Propagation with Learned Higher-Order Markov Random Fields. In: ECCV'06.
Mendiburu, A., Santana, R., Lozano, J. A., and Bengoetxea, E. 2007. A Parallel Framework for Loopy Belief Propagation. In: GECCO'07: Proceedings of the 2007 GECCO Conference Companion on Genetic and Evolutionary Computation.
Mooij, J. M., and Kappen, H. J. 2007. Sufficient Conditions for Convergence of the Sum-Product Algorithm. IEEE Transactions on Information Theory, 4422–4437.
Parberry, I. 1995. Load Sharing with Parallel Priority Queues. Journal of Computer and System Sciences, 50(1), 64–73.
Pearl, J. 1988. Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference. San Francisco: Morgan Kaufmann.
Ranganathan, A., Kaess, M., and Dellaert, F. 2007. Loopy SAM. In: IJCAI'07.
Roth, D. 1993. On the Hardness of Approximate Reasoning. Pages 613–618 of: IJCAI'93.
Sanders, P. 1998. Randomized Priority Queues for Fast Parallel Access. Journal of Parallel and Distributed Computing, 49(1), 86–97.
Saxena, A., Chung, S. H., and Ng, A. Y. 2007. 3-D Depth Reconstruction from a Single Still Image. International Journal of Computer Vision, 76(1), 53–69.
Singla, P., and Domingos, P. 2008. Lifted First-Order Belief Propagation. In: AAAI'08.
Sun, J., Shum, H. Y., and Zheng, N. N. 2002. Stereo Matching Using Belief Propagation. In: ECCV'02.
Tatikonda, S., and Jordan, M. I. 2002. Loopy Belief Propagation and Gibbs Measures. In: UAI'02.
Wainwright, M., Jaakkola, T., and Willsky, A. S. 2001. Tree-Based Reparameterization for Approximate Estimation on Graphs with Cycles. In: NIPS.
Yanover, C., and Weiss, Y. 2002. Approximate Inference and Protein Folding. Pages 84–86 of: NIPS.
Yanover, C., Schueler-Furman, O., and Weiss, Y. 2007. Minimizing and Learning Energy Functions for Side-Chain Prediction. Journal of Computational Biology, 381–395.
