Book contents
- Frontmatter
- Contents
- Preface
- Donors
- Bayesian Methods: General Background
- Monkeys, Kangaroos, and N
- The Theory and Practice of the Maximum Entropy Formalism
- Bayesian Non-Parametric Statistics
- Generalized Entropies and the Maximum Entropy Principle
- The Probability of a Probability
- Prior Probabilities Revisited
- Band Extensions, Maximum Entropy and the Permanence Principle
- Theory of Maximum Entropy Image Reconstruction
- The Cambridge Maximum Entropy Algorithm
- Maximum Entropy and the Moments Problem: Spectroscopic Applications
- Maximum-Entropy Spectrum from a Non-Extendable Autocorrelation Function
- Multichannel Maximum Entropy Spectral Analysis Using Least Squares Modelling
- Multichannel Relative-Entropy Spectrum Analysis
- Maximum Entropy and the Earth's Density
- Entropy and Some Inverse Problems in Exploration Seismology
- Principle of Maximum Entropy and Inverse Scattering Problems
- Index
Prior Probabilities Revisited
Published online by Cambridge University Press: 04 May 2010
ABSTRACT
Unknown prior probabilities can be treated as intervening variables in the determination of a posterior distribution. In essence, this involves determining the minimally informative information system consistent with a given likelihood matrix.
Some consequences of this approach are non-intuitive. In particular, in random sampling with an unknown prior, the computed prior is not invariant with respect to sample size.
GENERALITIES
The role of prior probabilities in inductive inference has been a lively issue since the posthumous publication of the works of Thomas Bayes at the close of the 18th century. Attitudes on the topic have ranged all the way from complete rejection of the notion of prior probabilities (Fisher, 1949) to an insistence by contemporary Bayesians that they are essential (de Finetti, 1975). A careful examination of some of the basics is contained in a seminal paper by E.T. Jaynes, the title of which in part suggested the title of the present essay (Jaynes, 1968).
The theorem of Bayes, around which the controversy swirls, is itself non-controversial. It is, in fact, hardly more than a statement of the law of the product for probabilities, plus the commutativity of the logical product. Equally straightforward is the fact that situations can be found for which representation by Bayes' theorem is unassailable. The classic classroom two-urn experiment is neatly tailored for this purpose. Thus, the issue is not so much a conceptual one, involving the epistemological status of prior probabilities, as it is a practical one.
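The "product rule plus commutativity" remark can be made explicit. The following derivation is a standard sketch, not taken from the chapter itself; $I$ denotes the background information:

```latex
% Product rule, applied in both orders (commutativity of the logical product AB = BA):
p(AB \mid I) \;=\; p(A \mid BI)\, p(B \mid I) \;=\; p(B \mid AI)\, p(A \mid I),
% whence, provided p(B \mid I) > 0, Bayes' theorem follows:
p(A \mid BI) \;=\; \frac{p(B \mid AI)\, p(A \mid I)}{p(B \mid I)} .
```

As an illustration in the spirit of the two-urn experiment (the numbers here are invented for the example): if urn 1 holds 3 red and 1 white ball, urn 2 holds 1 red and 3 white, and an urn is chosen by a fair coin toss, then after drawing a red ball the posterior is $p(\text{urn 1} \mid \text{red}) = \frac{(3/4)(1/2)}{(3/4)(1/2) + (1/4)(1/2)} = 3/4$. Here the prior $p(\text{urn 1}) = 1/2$ is fixed by the physical setup, which is precisely why such cases are unassailable.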
Chapter information: Maximum Entropy and Bayesian Methods in Applied Statistics: Proceedings of the Fourth Maximum Entropy Workshop, University of Calgary, 1984, pp. 117-130. Publisher: Cambridge University Press. Print publication year: 1986.