Book contents
- Frontmatter
- Contents
- List of contributors
- Prologue
- Preface and guide to the reader
- Acknowledgements
- Part I Background
- Part II Generalized approaches to quantum error correction
- Part III Advanced quantum codes
- 9 Quantum convolutional codes
- 10 Nonadditive quantum codes
- 11 Iterative quantum coding systems
- 12 Algebraic quantum coding theory
- 13 Optimization-based quantum error correction
- Part IV Advanced dynamical decoupling
- Part V Alternative quantum computation approaches
- Part VI Topological methods
- Part VII Applications and implementations
- Part VIII Critical evaluation of fault tolerance
- References
- Index
13 - Optimization-based quantum error correction
from Part III - Advanced quantum codes
Published online by Cambridge University Press: 05 September 2013
Summary
The purpose of quantum error correction (QEC) is to preserve a quantum state despite the presence of decoherence. As we see throughout this book, the desired robustness exacts a cost in resources, most commonly the inclusion of redundant qubits. The following are reasonable engineering queries: How much will it cost to provide the desired robustness? For a fixed cost, what is the best performance I can achieve? In this chapter, we present some numerical tools to illuminate these kinds of questions.
To understand the cost/performance trade-off quantitatively, we need a clear measure of performance and a model of the permissible operations. With this in mind, we will revisit the concepts of fidelity and quantum operations. As it turns out, this quantitative approach can yield well-structured convex optimization problems. Using powerful numerical tools, we determine optimal encodings and recovery operations. Even when the optimal results are not directly implemented, the optimization tools can guide practical designs by establishing the ultimate performance limits.
Limitation of the independent arbitrary errors model
As pointed out in Chapter 2, our rich history of classical error correction has provided a significant foundation for QEC methods. Accordingly, the initial QEC breakthroughs involved importing classical coding concepts into frameworks that produce robust quantum codes, such as CSS codes and the stabilizer formalism. We learned that we could create general-purpose codes that make minimal assumptions about the structure of the decoherence process.
- Quantum Error Correction, pp. 327-348. Publisher: Cambridge University Press. Print publication year: 2013.