Principal components analysis is probably the best known and most widely used of all multivariate analysis techniques. The essential idea is to perform a linear transformation of the observed k-dimensional variables in such a way that the new variables are vectors of k mutually orthogonal (uncorrelated) components – the principal components – ranked by decreasing variance. When the original variables are strongly interrelated, the first few principal components typically account for most of the variation in the original data. Restricting attention to those few components then reduces the dimension of the dataset while retaining most of the variability of the original one. Karl Pearson is generally credited with introducing the method in 1901. The same method was rediscovered in 1933 and popularized by Hotelling; ever since, it has been an essential part of daily statistical practice in virtually all domains of application. From a mathematical point of view, the method is directly rooted in the linear algebra of positive semidefinite matrices and their spectral factorization.
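The construction described above can be sketched numerically: spectrally factorize the (positive semidefinite) covariance matrix of centred data, order the eigenpairs by decreasing eigenvalue, and project the data onto the eigenvectors. This is a minimal illustration in Python with NumPy; the synthetic dataset is a hypothetical example, not drawn from the encyclopedia entry.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical correlated data: 200 observations of k = 3 variables,
# where the first two variables share a common latent factor.
z = rng.standard_normal((200, 1))
X = np.hstack([z + 0.1 * rng.standard_normal((200, 1)),
               2.0 * z + 0.1 * rng.standard_normal((200, 1)),
               rng.standard_normal((200, 1))])

# Centre the data and spectrally factorize the covariance matrix,
# which is symmetric positive semidefinite.
Xc = X - X.mean(axis=0)
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)  # eigh returns ascending order

# Rank the components by decreasing variance (eigenvalue).
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Principal component scores: a linear transformation of the data.
scores = Xc @ eigvecs

# The scores are mutually uncorrelated, and their variances are the
# eigenvalues; the leading components capture most of the variation.
explained = eigvals / eigvals.sum()
print("variance explained per component:", np.round(explained, 3))
```

Because the first two variables are driven by one latent factor, the first component alone accounts for the bulk of the total variance, which is exactly the dimension-reduction effect the abstract describes.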
Title of host publication: Encyclopedia of Environmetrics, 2nd Edition
Editors: W. Piegorsch, A. El Shaarawi
Number of pages: 3510
Publication status: Published - 2012