In the words of Hungarian mathematician Alfréd Rényi, ‘the mathematical theory of information came into being when it was realised that the flow of information can be expressed numerically in the same way as distance, time, mass, temperature …’
In this chapter, we are interested in the dynamics of the information flow in a random network. To make precise statements about this, we first need to introduce some information-theoretic concepts to clarify – from a mathematical perspective – the notion of information itself and that of communication rate. We shall see that the communication rate between pairs of nodes in the network depends on their (random) positions and on their transmission strategies. We consider two scenarios: in the first one, only two nodes wish to communicate and all the others help by relaying information; in the second case, different pairs of nodes wish to communicate simultaneously. We compute upper and lower bounds on achievable rates in the two cases, by exploiting some structural properties of random graphs that we have studied earlier. We take a statistical physics approach, in the sense that we derive scaling limits of achievable rates for large network sizes.
The topics of this chapter only scratch the surface of what is a large field of study; we discuss only those topics that are needed for our purposes. The interested reader may consult specific information-theory textbooks, such as McEliece (2004) and Cover and Thomas (2006), for a more in-depth study.
The act of communication can be interpreted as an alteration of the receiver's state brought about by a corresponding action of the transmitter.
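This view can be made concrete with a toy simulation: the transmitter's action (sending a bit) alters the receiver's state, but only probabilistically when the channel is noisy. The sketch below, a minimal illustration and not taken from the text, uses a binary symmetric channel with an assumed crossover probability `p`; the function name `bsc_transmit` is our own.

```python
import random

def bsc_transmit(bits, p, rng):
    """Pass each bit through a binary symmetric channel:
    each bit is flipped independently with probability p."""
    return [b ^ (rng.random() < p) for b in bits]

# Hypothetical usage: transmit a short message over a noisy channel
# and count how often the receiver's state differs from the intent.
rng = random.Random(0)
sent = [1, 0, 1, 1, 0, 0, 1, 0]
received = bsc_transmit(sent, 0.1, rng)
errors = sum(s != r for s, r in zip(sent, received))
print(received, errors)
```

With `p = 0` the receiver's state mirrors the transmitter's action exactly; as `p` grows, the link between action and state weakens, which is precisely what a quantitative notion of information flow must capture.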