Book contents
- Frontmatter
- Contents
- Preface
- 1 Introduction
- 2 Propositional Logic
- 3 Probability Calculus
- 4 Bayesian Networks
- 5 Building Bayesian Networks
- 6 Inference by Variable Elimination
- 7 Inference by Factor Elimination
- 8 Inference by Conditioning
- 9 Models for Graph Decomposition
- 10 Most Likely Instantiations
- 11 The Complexity of Probabilistic Inference
- 12 Compiling Bayesian Networks
- 13 Inference with Local Structure
- 14 Approximate Inference by Belief Propagation
- 15 Approximate Inference by Stochastic Sampling
- 16 Sensitivity Analysis
- 17 Learning: The Maximum Likelihood Approach
- 18 Learning: The Bayesian Approach
- A Notation
- B Concepts from Information Theory
- C Fixed Point Iterative Methods
- D Constrained Optimization
- Bibliography
- Index
10 - Most Likely Instantiations
Published online by Cambridge University Press: 23 February 2011
Summary
We consider in this chapter the problem of finding variable instantiations that have maximal probability under some given evidence. We present two classes of exact algorithms for this problem, one based on variable elimination and the other based on systematic search. We also present approximate algorithms based on local search.
Introduction
Consider the Bayesian network in Figure 10.1, which concerns a population that is 55% male and 45% female. According to this network, members of this population can suffer from a medical condition C that is more likely to occur in males. Moreover, two diagnostic tests are available for detecting this condition, T1 and T2, with the second test being more effective on females. The CPTs of this network also reveal that the two tests are equally effective on males.
One can partition the members of this population into four groups depending on whether they are male or female and whether or not they have the condition. Suppose that a person takes both tests and all we know is that the two tests yield the same result, leading to the evidence A=yes, where A is a variable indicating whether the two tests agree. We may then ask: What is the most likely group to which this individual belongs? This query asks for the most likely instantiation of variables S and C given evidence A=yes, which is technically known as a MAP instantiation. We have already discussed this class of queries in Chapter 5, where we referred to variables S and C as the MAP variables.
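As a concrete illustration, a MAP query of this kind can be answered on a small network by brute-force enumeration of the joint distribution. The sketch below assumes a network in which S is a parent of C and both tests depend on C and S; since the CPTs of Figure 10.1 are not reproduced in this excerpt, every number except the 55/45 sex split is a hypothetical placeholder, and the evidence A=yes is handled implicitly by restricting the sum to agreeing test outcomes.

```python
import itertools

# Hypothetical CPTs: only Pr(S) comes from the text; the remaining numbers
# are placeholders chosen to respect the stated qualitative facts (the
# condition is more likely in males, the tests are equally effective on
# males, and test 2 is more effective on females).
pr_S = {"male": 0.55, "female": 0.45}   # Pr(S)
pr_C = {"male": 0.05, "female": 0.01}   # Pr(C=yes | S), assumed
# Pr(T=+ve | C, S), assumed
pr_T1 = {("yes", "male"): 0.80, ("yes", "female"): 0.80,
         ("no",  "male"): 0.20, ("no",  "female"): 0.20}
pr_T2 = {("yes", "male"): 0.80, ("yes", "female"): 0.95,
         ("no",  "male"): 0.20, ("no",  "female"): 0.05}

def joint(s, c, t1, t2):
    """Pr(s, c, t1, t2) under the assumed structure: S -> C, with T1, T2
    depending on C and S, by the chain rule for Bayesian networks."""
    p = pr_S[s]
    p *= pr_C[s] if c == "yes" else 1.0 - pr_C[s]
    p *= pr_T1[(c, s)] if t1 == "+ve" else 1.0 - pr_T1[(c, s)]
    p *= pr_T2[(c, s)] if t2 == "+ve" else 1.0 - pr_T2[(c, s)]
    return p

# MAP variables are S and C; the evidence A=yes says the tests agree.
# Sum out T1, T2 over agreeing outcomes, then maximize over (S, C).
scores = {
    (s, c): sum(joint(s, c, t1, t2)
                for t1, t2 in itertools.product(("+ve", "-ve"), repeat=2)
                if t1 == t2)   # keep only the agreeing test results
    for s in ("male", "female")
    for c in ("yes", "no")
}
map_instantiation = max(scores, key=scores.get)
```

Enumeration like this is exponential in the number of variables, which is precisely why this chapter develops variable elimination, systematic search, and local search algorithms for the problem. Note also that the MAP instantiation is computed by maximizing *after* summing out the non-MAP variables T1 and T2; maximizing over all variables at once (an MPE query) can give a different answer.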
Chapter 10 of *Modeling and Reasoning with Bayesian Networks*, pp. 243–269. Cambridge University Press, 2009.