Principal components

M. Hallin, S. Hörmann

    Research output: Chapter in Book/Report/Conference proceeding › Chapter › Scientific › peer-review

    Abstract

    Principal components are probably the best known and most widely used of all multivariate analysis techniques. The essential idea is to perform a linear transformation of the observed k-dimensional variables so that the new variables are vectors of k mutually orthogonal (uncorrelated) components – the principal components – ranked by decreasing variance. If the original variables are strongly interrelated, the first few principal components will typically account for most of the variation in the original data. Restricting attention to those few components then reduces the dimension of the dataset while retaining most of the variability of the original one. Karl Pearson is generally credited with introducing the method in 1901. The same method was rediscovered in 1933 and popularized by Hotelling; ever since, it has been an essential part of daily statistical practice in essentially all domains of application. From a mathematical point of view, the method is directly rooted in the linear algebra of positive semidefinite matrices and their spectral factorization.
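    The transformation the abstract describes – centering the data, spectrally factorizing the (positive semidefinite) covariance matrix, and ranking the resulting uncorrelated components by decreasing variance – can be sketched as follows. This is an illustrative NumPy implementation, not code from the chapter; the function name and data are invented for the example.

    ```python
    import numpy as np

    def principal_components(X):
        """Illustrative PCA sketch: X is an (n, k) data matrix.

        Returns (scores, variances, components), with components ranked
        by decreasing variance, as described in the abstract.
        """
        Xc = X - X.mean(axis=0)                 # center each variable
        cov = Xc.T @ Xc / (len(X) - 1)          # sample covariance (k x k, PSD)
        eigvals, eigvecs = np.linalg.eigh(cov)  # spectral factorization
        order = np.argsort(eigvals)[::-1]       # rank by decreasing variance
        eigvals, eigvecs = eigvals[order], eigvecs[:, order]
        scores = Xc @ eigvecs                   # the principal components
        return scores, eigvals, eigvecs

    # Hypothetical example: correlated 3-dimensional data.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 3)) @ np.array([[3.0, 1.0, 0.0],
                                              [0.0, 1.0, 0.0],
                                              [0.0, 0.0, 0.1]])
    scores, variances, components = principal_components(X)
    ```

    Because the covariance matrix is symmetric positive semidefinite, its eigenvectors are mutually orthogonal, so the resulting component scores are uncorrelated; truncating to the leading components gives the dimension reduction described above.
    
    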
    Original language: English
    Title of host publication: Encyclopedia of Environmetrics, 2nd Edition
    Editors: W. Piegorsch, A. El Shaarawi
    Publisher: Wiley
    Pages: 1987-1988
    Number of pages: 3510
    ISBN (Print): 9780470973882
    Publication status: Published - 2012
