Book contents
- Frontmatter
- Dedication
- Contents
- Preface
- Notation
- 1 Introduction
- 2 Roadmap to the book
- 3 Mathematical context and background
- 4 Continuous-domain innovation models
- 5 Operators and their inverses
- 6 Splines and wavelets
- 7 Sparse stochastic processes
- 8 Sparse representations
- 9 Infinite divisibility and transform-domain statistics
- 10 Recovery of sparse signals
- 11 Wavelet-domain methods
- 12 Conclusion
- Appendix A Singular integrals
- Appendix B Positive definiteness
- Appendix C Special functions
- References
- Index
1 - Introduction
Published online by Cambridge University Press: 05 September 2014
Summary
Sparsity: Occam's razor of modern signal processing?
The hypotheses of Gaussianity and stationarity play a central role in the standard statistical formulation of signal processing. They fully justify the use of the Fourier transform as the optimal signal representation and naturally lead to the derivation of optimal linear filtering algorithms for a large variety of statistical estimation tasks. This classical view of signal processing is elegant and reassuring, but it is no longer at the forefront of research.
Starting with the discovery of the wavelet transform in the late 1980s [Dau88, Mal89], researchers in signal processing have progressively moved away from the Fourier transform and have uncovered powerful alternatives. Consequently, they have ceased modeling signals as Gaussian stationary processes and have adopted a more deterministic, approximation-theoretic point of view. The key developments that are presently reshaping the field, and which are central to the theory presented in this book, are summarized below.
• Novel transforms and dictionaries for the representation of signals. New redundant and non-redundant representations of signals (wavelets, local cosine, curvelets) have emerged since the mid-1990s and have led to better algorithms for data compression, data processing, and feature extraction. The most prominent example is the wavelet-based JPEG-2000 standard for image compression [CSE00], which outperforms the widely used JPEG method based on the DCT (discrete cosine transform). Another illustration is wavelet-domain image denoising, which provides a good alternative to more traditional linear filtering [Don95]. The various dictionaries of basis functions that have been proposed so far are tailored to specific types of signals; there does not appear to be one that fits all.
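The contrast between wavelet-domain denoising and linear filtering can be illustrated with a minimal sketch. The example below is not the book's method but a standard one-level Haar-wavelet denoiser with soft-thresholding (in the spirit of [Don95]): signals that are sparse in the wavelet domain concentrate their energy in a few large coefficients, so shrinking the small, noise-dominated detail coefficients suppresses noise while preserving edges. The signal, noise level, and threshold are illustrative choices, not taken from the text.

```python
import numpy as np

def haar_denoise(x, thresh):
    """One-level Haar wavelet denoising: analyze, soft-threshold the
    detail coefficients, and synthesize. Illustrative sketch only;
    practical denoisers use several decomposition levels and longer
    wavelet filters."""
    x = np.asarray(x, dtype=float)
    assert x.size % 2 == 0, "length must be even for one Haar level"
    # Analysis: orthonormal Haar averages (approximation) and differences (detail)
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    # Soft-thresholding shrinks small (noise-dominated) detail coefficients
    d = np.sign(d) * np.maximum(np.abs(d) - thresh, 0.0)
    # Synthesis: perfect reconstruction when thresh == 0
    y = np.empty_like(x)
    y[0::2] = (a + d) / np.sqrt(2)
    y[1::2] = (a - d) / np.sqrt(2)
    return y

# A piecewise-constant signal is sparse in the Haar basis: its detail
# coefficients vanish away from the jumps, so thresholding mostly removes noise.
rng = np.random.default_rng(0)
clean = np.repeat([0.0, 1.0, -1.0, 0.5], 16)
noisy = clean + 0.1 * rng.standard_normal(clean.size)
denoised = haar_denoise(noisy, thresh=0.2)
```

A linear (e.g. Wiener or moving-average) filter would blur the jumps of this signal; thresholding in the wavelet domain attenuates the noise while leaving the jump-carrying coefficients essentially intact.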
- An Introduction to Sparse Stochastic Processes, pp. 1–18. Publisher: Cambridge University Press. Print publication year: 2014.