### Abstract

Principal components are probably the best known and most widely used of all multivariate analysis techniques. The essential idea consists in performing a linear transformation of the observed k-dimensional variables in such a way that the new variables are vectors of k mutually orthogonal (uncorrelated) components – the *principal components* – ranked by decreasing variances. When the original variables are strongly interrelated, the first few principal components typically account for most of the variation in the original data. Restricting to those few components then allows for a reduction in the dimension of the dataset that retains most of the variability of the original one. Karl Pearson is generally credited with introducing the method in 1901. The same method was rediscovered in 1933 and popularized by Hotelling; ever since, it has been an essential part of daily statistical practice in essentially all domains of application. From a mathematical point of view, the method is directly rooted in the linear algebra of positive semidefinite matrices and their spectral factorization.
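The construction described in the abstract can be sketched in a few lines of NumPy: center the data, form the (positive semidefinite) sample covariance matrix, and take its spectral factorization; the eigenvectors give the linear transformation and the eigenvalues the component variances. This is a minimal illustration, not code from the chapter; all names and the synthetic dataset are illustrative.

```python
# Minimal PCA sketch via spectral factorization of the sample covariance
# matrix. Illustrative only; names and data are not from the source.
import numpy as np

def principal_components(X):
    """Return component scores and their variances, ranked decreasingly."""
    Xc = X - X.mean(axis=0)                 # center the k-dimensional observations
    cov = np.cov(Xc, rowvar=False)          # k x k covariance matrix (PSD)
    eigvals, eigvecs = np.linalg.eigh(cov)  # spectral factorization (ascending)
    order = np.argsort(eigvals)[::-1]       # rank by decreasing variance
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    scores = Xc @ eigvecs                   # mutually uncorrelated components
    return scores, eigvals

# Strongly interrelated variables: one latent factor drives all three columns,
# so the first principal component accounts for almost all of the variance.
rng = np.random.default_rng(0)
t = rng.normal(size=(500, 1))
X = np.hstack([t, 2 * t, -t]) + 0.01 * rng.normal(size=(500, 3))
scores, variances = principal_components(X)
print(variances / variances.sum())
```

Keeping only the first column of `scores` then reduces the dataset from three dimensions to one while retaining nearly all of its variability, which is exactly the dimension-reduction use described above.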

| Original language | English |
|---|---|
| Title of host publication | Encyclopedia of Environmetrics, 2nd Edition |
| Editors | W. Piegorsch, A. El Shaarawi |
| Publisher | Wiley |
| Pages | 1987-1988 |
| Number of pages | 3510 |
| ISBN (Print) | 9780470973882 |
| Publication status | Published - 2012 |

### Cite this

Hallin, M., & Hörmann, S. (2012). Principal components. In W. Piegorsch & A. El Shaarawi (Eds.), *Encyclopedia of Environmetrics, 2nd Edition* (pp. 1987-1988). Wiley.

Research output: Chapter in Book/Report/Conference proceeding › Chapter › Scientific › peer-review

```
TY - CHAP
T1 - Principal components
AU - Hallin, M.
AU - Hörmann, S.
N1 - Pagination: 3510
PY - 2012
Y1 - 2012
N2 - Principal components are probably the best known and most widely used of all multivariate analysis techniques. The essential idea consists in performing a linear transformation of the observed k-dimensional variables in such a way that the new variables are vectors of k mutually orthogonal (uncorrelated) components – the principal components – ranked by decreasing variances. When the original variables are strongly interrelated, the first few principal components typically account for most of the variation in the original data. Restricting to those few components then allows for a reduction in the dimension of the dataset that retains most of the variability of the original one. Karl Pearson is generally credited with introducing the method in 1901. The same method was rediscovered in 1933 and popularized by Hotelling; ever since, it has been an essential part of daily statistical practice in essentially all domains of application. From a mathematical point of view, the method is directly rooted in the linear algebra of positive semidefinite matrices and their spectral factorization.
AB - Principal components are probably the best known and most widely used of all multivariate analysis techniques. The essential idea consists in performing a linear transformation of the observed k-dimensional variables in such a way that the new variables are vectors of k mutually orthogonal (uncorrelated) components – the principal components – ranked by decreasing variances. When the original variables are strongly interrelated, the first few principal components typically account for most of the variation in the original data. Restricting to those few components then allows for a reduction in the dimension of the dataset that retains most of the variability of the original one. Karl Pearson is generally credited with introducing the method in 1901. The same method was rediscovered in 1933 and popularized by Hotelling; ever since, it has been an essential part of daily statistical practice in essentially all domains of application. From a mathematical point of view, the method is directly rooted in the linear algebra of positive semidefinite matrices and their spectral factorization.
M3 - Chapter
SN - 9780470973882
SP - 1987
EP - 1988
BT - Encyclopedia of Environmetrics, 2nd Edition
A2 - Piegorsch, W.
A2 - El Shaarawi, A.
PB - Wiley
ER -
```