Book contents
- Frontmatter
- Contents
- About this book
- Acknowledgments
- Introduction
- 0 Notational conventions
- PART ONE BASIC COMPLEXITY CLASSES
- PART TWO LOWER BOUNDS FOR CONCRETE COMPUTATIONAL MODELS
- PART THREE ADVANCED TOPICS
- 17 Complexity of counting
- 18 Average case complexity: Levin's theory
- 19 Hardness amplification and error-correcting codes
- 20 Derandomization
- 21 Pseudorandom constructions: Expanders and extractors
- 22 Proofs of PCP theorems and the Fourier transform technique
- 23 Why are circuit lower bounds so difficult?
- Appendix: Mathematical background
- Hints and selected exercises
- Main theorems and definitions
- Bibliography
- Index
- Complexity class index
21 - Pseudorandom constructions: Expanders and extractors
from PART THREE - ADVANCED TOPICS
Published online by Cambridge University Press: 05 June 2012
Summary
How difficult could it be to find hay in a haystack?
– Howard Karloff

The probabilistic method is a powerful technique for showing the existence of objects (e.g., graphs, functions) with certain desirable properties. We have already seen it used in Chapter 6 to show the existence of functions with high circuit complexity, in Chapter 19 to show the existence of good error-correcting codes, and in several other places in this book. But sometimes the mere existence of an object is not enough: we need an explicit and efficient construction. This chapter provides such constructions for two well-known (and related) families of pseudorandom objects, expanders and extractors. They are important in computer science because they can often be used to replace (or reduce) the amount of randomness needed in certain settings. This is reminiscent of derandomization, the topic of Chapter 20, and indeed we will see several connections to derandomization throughout the chapter. However, a big difference between Chapter 20 and this one is that all results proved here are unconditional; that is, they do not rely on unproven assumptions. Another topic related to expanders is the construction of error-correcting codes and the associated hardness-amplification results, which we saw in Chapter 19. For a brief discussion of the many deep and fascinating connections between codes, expanders, pseudorandom generators, and extractors, see the chapter notes.
Expanders are graphs whose connectivity properties (how many edges lie between every two sets A, B of vertices) are similar to those of "random" graphs; in this sense they are "pseudorandom," or "like random."
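As a concrete illustration of this connectivity property (not from the text), in a d-regular graph on n vertices the number of edges e(A, B) between two disjoint vertex sets A and B is, for a random-like graph, close to d|A||B|/n. The sketch below checks this numerically on a toy example, the 3-dimensional Boolean hypercube, a 3-regular graph on 8 vertices; the graph choice and helper names are illustrative assumptions.

```python
from itertools import combinations

# Toy example: the 3-dimensional Boolean hypercube, a 3-regular graph
# on 8 vertices. Vertices are 0..7; u and v are adjacent iff their
# binary representations differ in exactly one bit.
n, d = 8, 3
vertices = range(n)
edges = {(u, v) for u in vertices for v in vertices
         if u < v and bin(u ^ v).count("1") == 1}

def e(A, B):
    """Number of edges with one endpoint in A and the other in B."""
    return sum(1 for (u, v) in edges
               if (u in A and v in B) or (u in B and v in A))

# For a "random-like" d-regular graph, e(A, B) should be close to
# d*|A|*|B|/n for every pair of disjoint sets A, B.
for A in [set(c) for c in combinations(vertices, 4)][:3]:
    B = set(vertices) - A
    expected = d * len(A) * len(B) / n
    print(sorted(A), e(A, B), expected)
```

For the cut A = {0, 1, 2, 3} (a face of the cube) the actual count falls below the random-graph benchmark, while "generic" cuts come close to it; a true expander family guarantees that no cut falls too far below.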
Type: Chapter
Information: Computational Complexity: A Modern Approach, pp. 421–459
Publisher: Cambridge University Press
Print publication year: 2009