Book contents
- Frontmatter
- Dedication
- Contents
- List of Algorithms
- Notation
- Preface
- I Classical Methods
- II Factors and Groupings
- III Non-Gaussian Analysis
- 9 Towards Non-Gaussianity
- 10 Independent Component Analysis
- 11 Projection Pursuit
- 12 Kernel and More Independent Component Methods
- 13 Feature Selection and Principal Component Analysis Revisited
- Problems for Part III
- References
- Author Index
- Subject Index
- Data Index
9 - Towards Non-Gaussianity
from III - Non-Gaussian Analysis
Published online by Cambridge University Press: 05 June 2014
Summary
Man denkt an das, was man verließ; was man gewohnt war, bleibt ein Paradies (Johann Wolfgang von Goethe, 1749–1832, Faust II). We think of what we left behind; what we were familiar with remains a paradise.
Introduction
Gaussian random vectors are special: uncorrelated Gaussian vectors are independent. For non-Gaussian vectors, independence is strictly stronger than uncorrelatedness, and the gap between the two notions is tied to how far the distribution of the random vectors deviates from the Gaussian distribution.
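An illustrative sketch (not from the text) of why uncorrelatedness does not imply independence outside the Gaussian world: take X standard normal and Y = X². Then Cov(X, Y) = E[X³] = 0, so the pair is uncorrelated, yet Y is a deterministic function of X.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)
y = x**2  # completely determined by x, hence dependent

# Sample correlation is close to 0: Cov(X, X^2) = E[X^3] = 0.
corr = np.corrcoef(x, y)[0, 1]
print(f"sample correlation of (X, X^2): {corr:.3f}")

# Dependence surfaces in higher moments, e.g. Cov(X^2, Y) = Var(X^2) > 0.
print(f"sample Cov(X^2, Y): {np.cov(x**2, y)[0, 1]:.3f}")
```

For a *Gaussian* pair, vanishing correlation would already force independence; the example shows that this implication is a property of the Gaussian distribution, not of correlation itself.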
In Principal Component Analysis and Factor Analysis, the variability in the data drives the search for low-dimensional projections. In the next three chapters the search for direction vectors focuses on independence and deviations from Gaussianity of the low-dimensional projections:
• Independent Component Analysis in Chapter 10 explores the close relationship between independence and non-Gaussianity and finds directions which are as independent and as non-Gaussian as possible;
• Projection Pursuit in Chapter 11 ignores independence and focuses more specifically on directions that deviate most from the Gaussian distribution;
• the methods of Chapter 12 attempt to find characterisations of independence and integrate these properties in the low-dimensional direction vectors.
As in Parts I and II, the introductory chapter in this final part collects and summarises ideas and results that we require in the following chapters.
We begin with a visual comparison of Gaussian and non-Gaussian data.
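Alongside a visual comparison, deviation from Gaussianity can be quantified numerically. A minimal sketch (the distributions and sample sizes are illustrative, not from the text): sample skewness and excess kurtosis both vanish for Gaussian data but not for a skewed distribution such as the exponential.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

def skew_and_excess_kurtosis(z):
    """Standardise the sample, then return (skewness, excess kurtosis)."""
    z = (z - z.mean()) / z.std()
    return np.mean(z**3), np.mean(z**4) - 3.0

gauss = rng.standard_normal(n)          # skew 0, excess kurtosis 0
expo = rng.exponential(size=n)          # skew 2, excess kurtosis 6

print("Gaussian    (skew, excess kurtosis):", skew_and_excess_kurtosis(gauss))
print("Exponential (skew, excess kurtosis):", skew_and_excess_kurtosis(expo))
```

Measures of this kind reappear in the following chapters, where directions are ranked by how non-Gaussian the projected data look.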
- Analysis of Multivariate and High-Dimensional Data, pp. 295–304. Publisher: Cambridge University Press. Print publication year: 2013.