![](https://assets.cambridge.org/97805217/66982/cover/9780521766982.jpg)
Book contents
- Frontmatter
- Dedication
- Contents
- Preface
- Acknowledgements
- List of notation
- 1 Introduction
- 2 Lattices
- 3 Figures of merit
- 4 Dithering and estimation
- 5 Entropy-coded quantization
- 6 Infinite constellation for modulation
- 7 Asymptotic goodness
- 8 Nested lattices
- 9 Lattice shaping
- 10 Side-information problems
- 11 Modulo-lattice modulation
- 12 Gaussian networks
- 13 Error exponents
- Appendix
- References
- Index
5 - Entropy-coded quantization
Published online by Cambridge University Press: 05 August 2014
Summary
The elements of dithering and estimation provide tools to control the distribution of the quantization error, and to compute the average distortion. Another important parameter of source coding is the coding rate.
In this chapter we focus on the quantizer entropy as a measure of the coding rate. Although the lattice is unbounded, entropy coding keeps the coding rate finite. We examine the entropy-distortion trade-off for a general source and lattice quantizer in Sections 5.2–5.4, and compare it to Shannon's rate-distortion function R(D) – the ultimate compression rate of any system achieving a distortion level D – in Sections 5.5–5.6. As we shall see, the redundancy above R(D) is small for all sources; even for a simple scalar lattice quantizer it is at most ≈3/4 bit, and only ≈1/4 bit at high resolution. Furthermore, if we add a Wiener filter at the quantizer output (as we did in Section 4.5), then the redundancy of a scalar ECDQ for a Gaussian source is at most ≈1/4 bit at any resolution.
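As a numeric sketch of the "≈1/4 bit at high resolution" figure (the derivation itself appears later in the chapter), recall the standard high-resolution facts: a scalar uniform quantizer with step Δ has distortion D = Δ²/12 and output entropy ≈ h(X) − log₂Δ. For a Gaussian source, h(X) = ½log₂(2πeσ²) and R(D) = ½log₂(σ²/D), so the redundancy is ½log₂(2πe/12), independent of Δ:

```python
import math

# High-resolution redundancy of a scalar uniform quantizer over the
# Gaussian rate-distortion function R(D) = 0.5*log2(sigma^2/D).
# With step delta: D = delta**2 / 12 and entropy H ~ h(X) - log2(delta),
# where h(X) = 0.5*log2(2*pi*e*sigma**2) for a Gaussian source.
# The difference H - R(D) = 0.5*log2(2*pi*e/12) does not depend on delta.

redundancy_bits = 0.5 * math.log2(2 * math.pi * math.e / 12)
print(f"{redundancy_bits:.4f} bits")  # about 0.2546 bits, the "1/4 bit" figure
```

The same constant, ½log₂(2πe/12) ≈ 0.254 bit, is the familiar gap between the entropy of uniform (lattice) quantization and the Gaussian rate-distortion bound in the high-resolution regime.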
The Shannon entropy
In fixed-rate lossless coding, all elements of the data are mapped into binary codewords of identical length. Let A denote the data alphabet. Since there are 2^l binary words of length l, the codeword length must be at least the base-2 logarithm of the size of A, rounded up to the nearest integer.
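This counting argument can be stated in one line of code (a minimal illustration; the function name is ours, not the book's):

```python
import math

# Fixed-rate lossless coding: codewords of length l bits give 2**l
# distinct words, so assigning every symbol of an alphabet A its own
# codeword requires l >= ceil(log2(|A|)).

def fixed_rate_length(alphabet_size: int) -> int:
    return math.ceil(math.log2(alphabet_size))

print(fixed_rate_length(8))   # 3 bits suffice for 8 symbols
print(fixed_rate_length(10))  # 10 symbols already require 4 bits
```

Note the integer rounding: an alphabet of 10 symbols costs the same 4 bits per symbol as an alphabet of 16, which is one source of the inefficiency that variable-length (entropy) coding removes.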
Chapter information: *Lattice Coding for Signals and Networks: A Structured Coding Approach to Quantization, Modulation and Multiuser Information Theory*, pp. 84–109. Cambridge University Press, print publication year 2014.