Entropy is a subject that has played a central role in a number of areas, such as statistical mechanics and information theory. The connections between the various applications of entropy have become clearer in recent years with the introduction of probability theory into its foundations. It is now possible to see a number of previously isolated results in various disciplines as part of a more general mathematical theory of entropy.
This volume presents a self-contained exposition of the mathematical theory of entropy. Those parts of probability theory necessary for an understanding of the central topics concerning entropy have been included. In addition, carefully chosen examples are given so that the reader may omit the proofs of some of the theorems and yet, by studying these examples and the accompanying discussion, gain insight into the theorems.
The last four chapters describe those parts of information theory, ergodic theory, statistical mechanics, and topological dynamics that are most affected by entropy. These chapters may be read independently of each other. The examples show how ideas originating in one area have influenced the others. Chapter III contains a brief description of how entropy, as a measure of information flow, has affected information theory, and complements the first part of The Theory of Information and Coding by R. J. McEliece (volume 3 of this ENCYCLOPEDIA). Recent applications of entropy to statistical mechanics and topological dynamics are given in Chapters V and VI.