The truth is rarely pure and never simple (Oscar Wilde, The Importance of Being Earnest, 1854–1900).
In the Factor Analysis model X = AF + μ + ε, an essential aim is to find an expression for the unknown d × k matrix of factor loadings A. Of secondary interest is the estimation of F. If X comes from a Gaussian distribution, then the principal component (PC) solution for A and F results in independent scores, but this luxury is lost in the PC solution for non-Gaussian random vectors and data. Surprisingly, it is not the search for a generalisation of Factor Analysis, but the departure from Gaussianity, that has paved the way for new developments.
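A small simulation illustrates the point (the loading matrix, factor distribution, and dimensions below are hypothetical choices, not taken from the text): when the factors F are non-Gaussian, the PC scores of X = AF + μ + ε remain uncorrelated by construction, yet a simple check on their squares reveals that they need not be independent.

```python
import numpy as np

rng = np.random.default_rng(0)
d, k, n = 5, 2, 10_000  # hypothetical dimensions: d variables, k factors, n samples

# Hypothetical loadings A and non-Gaussian (uniform) factors F with unit variance.
A = rng.normal(size=(d, k))
F = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(k, n))
eps = 0.1 * rng.normal(size=(d, n))   # small Gaussian noise
X = A @ F + eps                        # mean mu taken as zero here

# PC scores: project the centred data onto the top-k eigenvectors
# of the sample covariance matrix.
Xc = X - X.mean(axis=1, keepdims=True)
S = Xc @ Xc.T / n
eigvals, eigvecs = np.linalg.eigh(S)
W = eigvecs[:, ::-1][:, :k]            # eigenvectors of the k largest eigenvalues
scores = W.T @ Xc                      # k x n matrix of PC scores

# The PC scores are uncorrelated (off-diagonal correlation is zero
# up to numerical error) ...
C = np.corrcoef(scores)
print(np.round(C, 3))

# ... but uncorrelated is weaker than independent: the correlation between
# the squared scores is a crude dependence check and is generally nonzero
# for non-Gaussian factors.
C2 = np.corrcoef(scores**2)
print(np.round(C2[0, 1], 3))
```

The squared-score correlation is only one of many possible dependence diagnostics; it suffices here to show that the PC solution does not deliver independence once Gaussianity is dropped.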
In psychology, for example, scores in mathematics, language and literature, or comprehension tests are used to describe a person's intelligence. A Factor Analysis approach aims to find the underlying or hidden kinds of intelligence from the test scores, typically under the assumption that the data come from the Gaussian distribution. Independent Component Analysis, too, strives to find these hidden quantities, but under the assumption that the data are non-Gaussian. This assumption precludes the use of the Gaussian likelihood, and the independent component (IC) solution will differ from the maximum-likelihood (ML) Factor Analysis solution, which may not be appropriate for non-Gaussian data.
To get some insight into the type of solution one hopes to obtain with Independent Component Analysis, consider, for example, the superposition of sound tracks.
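As a concrete sketch of such a superposition (the two source signals and the mixing matrix below are hypothetical, chosen purely for illustration), each observed track is a linear combination of the underlying sources, and the mixtures are strongly correlated even though the sources themselves are essentially uncorrelated:

```python
import numpy as np

n = 2_000
t = np.linspace(0, 1, n)

# Two hypothetical independent source tracks: a sine tone and a sawtooth.
s1 = np.sin(2 * np.pi * 5 * t)
s2 = 2 * ((3 * t) % 1) - 1
S = np.vstack([s1, s2])                # 2 x n matrix of sources

# Each "microphone" records a different linear superposition of the sources.
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])             # hypothetical mixing matrix
X = A @ S                               # 2 x n matrix of observed tracks

# The sources are nearly uncorrelated, but the observed mixtures are not.
print(np.round(np.corrcoef(S)[0, 1], 3))
print(np.round(np.corrcoef(X)[0, 1], 3))
```

Recovering S from X alone, without knowing A, is the task Independent Component Analysis addresses, and the non-Gaussianity of the sources is what makes the recovery possible.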