Book contents
- Frontmatter
- Contents
- Preface
- Notation
- Commonly used abbreviations
- 1 Channels, codes and capacity
- 2 Low-density parity-check codes
- 3 Low-density parity-check codes: properties and constructions
- 4 Convolutional codes
- 5 Turbo codes
- 6 Serial concatenation and RA codes
- 7 Density evolution and EXIT charts
- 8 Error floor analysis
- References
- Index
Preface
Published online by Cambridge University Press: 05 June 2012
Summary
The field of error correction coding was launched with Shannon's revolutionary 1948 work showing – quite counter to intuition – that it is possible to transmit digital data with arbitrarily high reliability over noise-corrupted channels, provided that the rate of transmission is below the capacity of the channel. The mechanism for achieving this reliable communication is to encode a digital message with an error correction code prior to transmission and apply a decoding algorithm at the receiver.
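The encode-then-decode mechanism described above can be illustrated (with an example not drawn from the book) using the simplest possible error correction code, a rate-1/3 repetition code sent over a binary symmetric channel. This sketch assumes nothing beyond the standard library; the function names are illustrative only.

```python
import random

def encode(bits, n=3):
    # Rate-1/n repetition code: repeat each message bit n times.
    return [b for b in bits for _ in range(n)]

def channel(codeword, p=0.1, seed=0):
    # Binary symmetric channel: flip each transmitted bit with probability p.
    rng = random.Random(seed)
    return [b ^ (rng.random() < p) for b in codeword]

def decode(received, n=3):
    # Majority-vote decoding: each block of n received bits yields one bit.
    return [int(sum(received[i:i + n]) > n // 2)
            for i in range(0, len(received), n)]

message = [1, 0, 1, 1, 0, 0, 1, 0]
decoded = decode(channel(encode(message)))
```

A repetition code corrects any single flip within a block, at the cost of tripling the transmission time; Shannon's result says far better trade-offs are achievable, which is what the codes in this book pursue.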
Classical block and convolutional error correction codes were described soon afterwards, and the first iterative codes were published by Gallager in his 1962 thesis; however, they received little attention until the late 1990s. In the meantime, the highly structured algebraic codes introduced by Hamming, Elias, Reed, Muller, Solomon and Golay, among others, dominated the field. Despite the enormous practical success of these classical codes, their performance fell well short of the theoretically achievable limits set down by Shannon in his seminal 1948 paper. By the late 1980s, despite decades of attempts, researchers were largely resigned to this seemingly insurmountable theory–practice gap.
The relative quiescence of the coding field was utterly transformed by the introduction of “turbo codes”, proposed by Berrou, Glavieux and Thitimajshima in 1993, wherein all the key ingredients of successful error correction codes were replaced: turbo codes involve very little algebra, employ iterative, distributed algorithms, and focus on average (rather than worst-case) performance. This was a ground-shifting paper, forcing coding theorists to revise strongly held beliefs.
- Type: Chapter
- Information: Iterative Error Correction: Turbo, Low-Density Parity-Check and Repeat-Accumulate Codes, pp. xi–xiv
- Publisher: Cambridge University Press
- Print publication year: 2009