Book contents
- Frontmatter
- Contents
- List of Figures
- List of Tables
- Preface
- Part I Fundamentals of Bayesian Inference
- 1 Introduction
- 2 Basic Concepts of Probability and Inference
- 3 Posterior Distributions and Inference
- 4 Prior Distributions
- Part II Simulation
- Part III Applications
- A Probability Distributions and Matrix Theorems
- B Computer Programs for MCMC Calculations
- Bibliography
- Author Index
- Subject Index
3 - Posterior Distributions and Inference
from Part I - Fundamentals of Bayesian Inference
Published online by Cambridge University Press: 05 June 2012
Summary
The first section of this chapter discusses general properties of posterior distributions. It continues with an explanation of how a Bayesian statistician uses the posterior distribution to conduct statistical inference, which is concerned with learning about parameter values in the form of point or interval estimates, making predictions, and comparing alternative models.
Properties of Posterior Distributions
In this section, we discuss general properties of posterior distributions, starting with the choice of the likelihood function. We continue by generalizing the concept to include models with more than one parameter and go on to discuss the revision of posterior distributions as more data become available, the role of the sample size, and the concept of identification.
The Likelihood Function
As we have seen, the posterior distribution is proportional to the product of the likelihood function and the prior distribution. The latter is somewhat controversial and is discussed in Chapter 4, but the choice of a likelihood function is also an important matter and requires discussion. A central issue is that the Bayesian must specify an explicit likelihood function to derive the posterior distribution. In some cases, the choice of a likelihood function appears straightforward. In the coin-tossing experiment of Section 2.2, for example, the choice of a Bernoulli distribution seems natural, but it does require the assumptions of independent trials and a constant probability.
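The Bernoulli case can be made concrete with a short sketch. The likelihood for $y$ successes in $n$ independent trials with constant probability $\theta$ is $\theta^y(1-\theta)^{n-y}$; combined with a Beta$(a, b)$ prior (an illustrative assumption here, since prior choice is the subject of Chapter 4), the posterior is proportional to $\theta^{a+y-1}(1-\theta)^{b+n-y-1}$, i.e., Beta$(a+y,\, b+n-y)$:

```python
# Illustrative sketch: posterior for a Bernoulli likelihood under a
# Beta(a, b) prior (the prior is an assumption for this example).
# Posterior ~ likelihood x prior = Beta(a + y, b + n - y).

def beta_bernoulli_posterior(a, b, y, n):
    """Return the (alpha, beta) parameters of the Beta posterior
    after observing y successes in n independent Bernoulli trials."""
    return a + y, b + n - y

# Example: uniform Beta(1, 1) prior, 7 successes in 10 trials.
a_post, b_post = beta_bernoulli_posterior(1.0, 1.0, 7, 10)
post_mean = a_post / (a_post + b_post)  # posterior mean of theta
```

Note that the two assumptions mentioned above, independence and a constant success probability, are exactly what allow the likelihood to be written as this single product over trials.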
Introduction to Bayesian Econometrics, pp. 20-40. Publisher: Cambridge University Press. Print publication year: 2007.