Book contents
- Frontmatter
- Dedication
- Contents
- Preface
- Acknowledgements
- List of notation
- 1 Introduction
- 2 Lattices
- 3 Figures of merit
- 4 Dithering and estimation
- 5 Entropy-coded quantization
- 6 Infinite constellation for modulation
- 7 Asymptotic goodness
- 8 Nested lattices
- 9 Lattice shaping
- 10 Side-information problems
- 11 Modulo-lattice modulation
- 12 Gaussian networks
- 13 Error exponents
- Appendix
- References
- Index
10 - Side-information problems
Published online by Cambridge University Press: 05 August 2014
Summary
Classical information theory deals with point-to-point communication, where a single source is transmitted over a channel to a single destination. In a distributed scenario, there may be more than one source or more than one channel and destination. The simplest setup, which captures much of the essence of the problem, is that of sources and channels with side information. The idea of coding with side information appeared for the first time in the seminal work of Slepian and Wolf from 1973 [247]. Let us illustrate this idea with a couple of examples.
Predictive coding of temperature
Suppose I wish to communicate tomorrow's temperature to my friend, after hearing the weather forecast. If the relevant range is 21–28 °C, then I need three bits of information to tell her my number. But suppose that in this season of the year the temperature changes daily by exactly one degree. If today's temperature is known to both of us, then clearly one bit of information is sufficient: “0” means −1 °C and “1” means +1 °C. But what if today's temperature is known only to my friend?
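Remarkably, one bit still suffices even when the encoder never sees today's temperature. The following toy sketch (my own illustration, not the book's construction) exploits the fact that the two candidates for tomorrow's temperature, Y − 1 and Y + 1, differ by exactly 2, so the parity of ⌊X/2⌋ distinguishes them:

```python
# Toy sketch (an illustration, not the book's construction): communicate
# tomorrow's temperature X in {21, ..., 28} when only the decoder knows
# today's temperature Y, and it is known that X = Y - 1 or X = Y + 1.
# The two candidates differ by exactly 2, so floor(X/2) differs by
# exactly 1 between them, and its parity -- a single bit -- already
# identifies X. Crucially, the encoder never needs to see Y.

def encode(x: int) -> int:
    """One-bit description of x: the parity of floor(x/2)."""
    return (x // 2) % 2

def decode(bit: int, y: int) -> int:
    """Pick whichever candidate y - 1 or y + 1 matches the received bit."""
    for candidate in (y - 1, y + 1):
        if encode(candidate) == bit:
            return candidate
    raise ValueError("side information inconsistent with received bit")

# Every admissible (today, tomorrow) pair is recovered from one bit:
for y in range(22, 28):
    for x in (y - 1, y + 1):
        assert decode(encode(x), y) == x
```

The mapping x ↦ (x // 2) % 2 partitions the temperature range into bins, and the side information at the decoder resolves which member of the bin was meant — the binning idea at the heart of Slepian–Wolf coding.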
In the general case, X and Y are two correlated memoryless sources, where X is known to the encoder and Y is known to the decoder.
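A standard structured realization of this binning idea — sketched here with a classical binary code for illustration; the chapter develops the lattice analog — uses the syndromes of a linear code as bin indices. Suppose x and y are 7-bit words that differ in at most one position. The encoder, which never sees y, sends only the 3-bit syndrome of x under the (7,4) Hamming code, and the decoder recovers x exactly:

```python
# Illustrative Slepian-Wolf binning with a (7,4) Hamming code (a sketch;
# the chapter's constructions use lattices instead of binary codes).
# Model: x and y are 7-bit words differing in at most one position.
# The encoder sends the 3-bit syndrome of x -- 3 bits instead of 7.

H = [0b0001111,  # parity-check rows of the (7,4) Hamming code;
     0b0110011,  # column j (1-indexed from the left) is the binary
     0b1010101]  # expansion of j, so an error's syndrome IS its position

def syndrome(word: int) -> int:
    """3-bit syndrome of a 7-bit word: its bin index."""
    s = 0
    for row in H:
        s = (s << 1) | (bin(word & row).count("1") % 2)
    return s

def encode(x: int) -> int:
    return syndrome(x)          # 3 bits, computed without seeing y

def decode(s: int, y: int) -> int:
    e = s ^ syndrome(y)         # = syndrome(x XOR y), by linearity
    if e == 0:
        return y                # x and y agree
    return y ^ (1 << (7 - e))   # flip the position the syndrome names
```

Because the syndrome map is linear over GF(2), the decoder can compute the syndrome of the unknown difference x ⊕ y from s and y alone; the code's single-error-correcting structure then pins down the difference, exactly as the decoder's side information resolved the bin in the temperature example.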
- *Lattice Coding for Signals and Networks: A Structured Coding Approach to Quantization, Modulation and Multiuser Information Theory*, pp. 247–294. Publisher: Cambridge University Press. Print publication year: 2014.