Book contents
- Frontmatter
- Dedication
- Contents
- Preface
- Notation
- Part One Machine Learning
- Part Two Optimal Recovery
- Executive Summary
- 9 Foundational Results of Optimal Recovery
- 10 Approximability Models
- 11 Ideal Selection of Observation Schemes
- 12 Curse of Dimensionality
- 13 Quasi-Monte Carlo Integration
- Part Three Compressive Sensing
- Part Four Optimization
- Part Five Neural Networks
- Appendices
- References
- Index
12 - Curse of Dimensionality
from Part Two - Optimal Recovery
Published online by Cambridge University Press: 21 April 2022
Summary
This chapter introduces different notions of tractability and intractability. Most notably, the curse of dimensionality occurs when solving a multivariate problem requires a number of observations that grows exponentially with the number of variables involved. As illustrations, relying on reproducing-kernel techniques, it is shown on the one hand that the integration of trigonometric polynomials is an intractable problem and, on the other hand, that integration in weighted Sobolev spaces is a tractable problem provided the weights decay fast enough.
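To make the phrase "growing exponentially with the number of variables" concrete, here is a minimal formalization in the style of information-based complexity; the symbols n(ε, d), c, γ, and ε₀ below are introduced only for illustration and are not taken verbatim from the chapter. Writing n(ε, d) for the smallest number of observations needed to solve the d-variate problem to accuracy ε, the problem is said to suffer from the curse of dimensionality when there exist constants c > 0, ε₀ > 0, and γ > 0 such that

\[
n(\varepsilon, d) \;\ge\; c\,(1+\gamma)^{d}
\qquad \text{for all } \varepsilon \le \varepsilon_0 \text{ and infinitely many } d.
\]

In this vocabulary, tractability instead asks for n(ε, d) to be bounded polynomially in d and 1/ε, which is the sense in which integration in weighted Sobolev spaces with fast-decaying weights is tractable.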
- Type: Chapter
- Information: Mathematical Pictures at a Data Science Exhibition, pp. 94-101
- Publisher: Cambridge University Press
- Print publication year: 2022