Book contents
- Frontmatter
- Contents
- Preface
- Donors
- Bayesian Methods: General Background
- Monkeys, Kangaroos, and N
- The Theory and Practice of the Maximum Entropy Formalism
- Bayesian Non-Parametric Statistics
- Generalized Entropies and the Maximum Entropy Principle
- The Probability of a Probability
- Prior Probabilities Revisited
- Band Extensions, Maximum Entropy and the Permanence Principle
- Theory of Maximum Entropy Image Reconstruction
- The Cambridge Maximum Entropy Algorithm
- Maximum Entropy and the Moments Problem: Spectroscopic Applications
- Maximum-Entropy Spectrum from a Non-Extendable Autocorrelation Function
- Multichannel Maximum Entropy Spectral Analysis Using Least Squares Modelling
- Multichannel Relative-Entropy Spectrum Analysis
- Maximum Entropy and the Earth's Density
- Entropy and Some Inverse Problems in Exploration Seismology
- Principle of Maximum Entropy and Inverse Scattering Problems
- Index
The Probability of a Probability
Published online by Cambridge University Press: 04 May 2010
ABSTRACT
MAXENT (MAXimum ENTropy principle) is a general method of statistical inference derived from and intrinsic to statistical mechanics. The probabilities it produces are “logical probabilities” – measures of the logical relationship between hypothesis and evidence. We consider the significance and applications of the “logical probability” of such probabilities. The probability of a “logical probability” is shown to be the probability of the evidence used for the “logical probability”. This suggests a hierarchy of logics, with “evidences” defined as sets of probabilities on the preceding “logic”. Applications to reliability theory are described. We also clarify the meaning of MAXENT and examine arguments in a recent article in which temperature fluctuations are introduced in thermal physics.
INTRODUCTION
A method fundamental to statistical physics is the maximization of entropy. In recent years, this method has been recognized as a general procedure for statistical inference, based on the fact that “entropy” is essentially a measure of informational uncertainty [1]. The probabilities one obtains using MAXENT (as the “Maximum Entropy Principle” is now called) have a natural interpretation which has not been generally recognized, even by advocates of the procedure. This is the “degree of belief” (DOB) interpretation [2] – that “probability” is a measure of the logical relationship between two propositions: p(H | E) expresses a (normalized) degree of belief in the relationship of hypothesis H to evidence E. Indeed, MAXENT asserts precisely the (statistical) consequences of assumed evidence, since it is based on the idea that one should choose the probability distribution that maximizes “uncertainty” subject to the constraints imposed by the evidence.
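To make the procedure concrete, here is a minimal sketch (not from the chapter) of MAXENT applied to Jaynes' well-known dice example: among all distributions over the faces 1–6 whose mean matches an observed value, the maximum-entropy distribution has the exponential form p_i ∝ exp(λx_i), and the Lagrange multiplier λ can be found numerically. The function name and the bisection bracket are illustrative choices, not anything specified in the text.

```python
import math

def maxent_distribution(values, target_mean, tol=1e-10):
    """Maximum-entropy distribution over `values` with a fixed mean.

    The constrained entropy maximization gives p_i proportional to
    exp(lam * x_i); we solve for the multiplier `lam` by bisection,
    using the fact that the resulting mean is increasing in `lam`.
    """
    def mean_for(lam):
        weights = [math.exp(lam * x) for x in values]
        z = sum(weights)  # partition function
        return sum(w * x for w, x in zip(weights, values)) / z

    lo, hi = -50.0, 50.0  # assumed bracket for the multiplier
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid

    lam = 0.5 * (lo + hi)
    weights = [math.exp(lam * x) for x in values]
    z = sum(weights)
    return [w / z for w in weights]

# Evidence: a die whose long-run average is 4.5 rather than 3.5.
p = maxent_distribution(range(1, 7), 4.5)
```

The returned distribution reproduces the constrained mean exactly and, since λ > 0 here, assigns monotonically increasing probability to the higher faces — the "least committal" distribution consistent with the evidence.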
- Type: Chapter
- Book: Maximum Entropy and Bayesian Methods in Applied Statistics: Proceedings of the Fourth Maximum Entropy Workshop, University of Calgary, 1984, pp. 101–116
- Publisher: Cambridge University Press
- Print publication year: 1986