
Chapter 2 - Entropy and Information

Summary

In this chapter we shall give a mathematical definition of the information in a random event and the entropy of experiments with a countable number of outcomes. We shall also indicate how the entropy is a measure of uncertainty and then give the main properties satisfied by both information and entropy. Then the definition of entropy will be extended to include experiments with an arbitrary number of outcomes, and the properties of entropy will be proven for this case. Finally we give the definitions of the rate of information generation and the entropy of a dynamical system and derive their most important properties. We conclude with several examples and a brief discussion of two useful extensions of these definitions.

Information and Uncertainty of Events

Let (Ω, F, P) be a Lebesgue space and E an event in F. Thinking of the Lebesgue space as a mathematical model of some random experiment, suppose an outcome of this experiment results in the event E. We have gained some information because we know that E occurred. The purpose of this first section is to define a function I on the events of a Lebesgue space so that I(E) gives a quantitative measure of the information gained when the outcome of the experiment results in E.
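As a preview of where this section is heading (a sketch stated in advance, not yet justified by the development below), the standard choice is to measure the information of an event by the negative logarithm of its probability; the base of the logarithm is a convention, with base 2 giving units of bits:

\[
I(E) = -\log_2 P(E), \qquad E \in F, \quad P(E) > 0.
\]

For instance, an event of probability 1/2, such as a fair coin landing heads, yields I(E) = 1 bit, while a certain event, with P(E) = 1, yields I(E) = 0: its occurrence tells us nothing new.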

Before the experiment is performed, the uncertainty about whether its outcome will result in the event E should equal the information gained if the outcome does result in E.
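One standard heuristic for why a logarithm is the natural choice here (assuming the preview definition sketched above): if E₁ and E₂ are independent events, the information gained from observing both should be the sum of the information gained from each, and the logarithm converts the product rule for independent probabilities into exactly this additivity:

\[
I(E_1 \cap E_2) = -\log_2 \big( P(E_1)\,P(E_2) \big) = I(E_1) + I(E_2).
\]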