This text is based on a course of the same title given at Cambridge for a number of years. It consists of an introduction to information theory and to coding theory at a level appropriate to mathematics undergraduates in their second or later years. The prerequisites are a knowledge of discrete probability theory and no more than an acquaintance with continuous probability distributions (including the normal). What is needed from finite-field theory is developed in the course of the text, but some knowledge of group theory and vector spaces is taken for granted.
The two topics treated are traditionally put into mathematical pigeon-holes remote from each other. They do, however, fit well together in a course, addressing from different standpoints the same problem, that of communication through noisy channels. The authors hope that undergraduates who have liked algebra courses, or probability courses, will enjoy the other half of the book also, and will feel at the end that their knowledge of how it all fits together is greater than the sum of its parts.
The Cambridge course was invented by Peter Whittle, and the debt that the book, particularly its information-theoretic part, owes him is unrepayable. Certain features that distinguish the present approach from that found elsewhere are due to him, in particular the conceptual ‘decoupling’ of source and channel, and the definition of channel capacity as a maximized rate of reliable transmission. The usual definition of channel capacity is, from that standpoint, an evaluation, less fundamental than the definition.
In detail, the first four chapters cover the information-theory part of the course. The first, on noiseless coding, also introduces entropy, for use throughout the text. Chapter 2 deals with information sources and gives a careful treatment of the evaluation of the rate of information output. Chapters 3 and 4 deal with channels and random coding. Chapter 3 takes an initial approach to the evaluation of channel capacity that is not quite sharp, and so yields only bounds, but that seems considerably more direct and illuminating than the usual approach through mutual information. The latter route is taken in Chapter 4, where several channel capacities are exactly calculated.