When Does Subagging Work?

Research output: Working paper › Other research output

Abstract

We study the effectiveness of subagging, or subsample aggregating, on regression trees, a
popular non-parametric method in machine learning. First, we give sufficient conditions
for pointwise consistency of trees. We formalize that (i) the bias depends on the diameter
of cells, hence trees with few splits tend to be biased, and (ii) the variance depends on the
number of observations in cells, hence trees with many splits tend to have large variance.
While these statements for bias and variance are known to hold globally in the covariate
space, we show that, under some constraints, they are also true locally. Second, we compare
the performance of subagging to that of trees across different numbers of splits. We find
that (1) for any given number of splits, subagging improves upon a single tree, and (2)
this improvement is larger for many splits than for few splits. However, (3) a single
tree grown to its optimal size can outperform subagging if the size of the individual
subagged trees is not chosen optimally. This last result goes against the common practice
of growing large randomized trees to eliminate bias and then averaging to reduce variance.
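
To make the setup concrete, below is a minimal sketch of subagging with regression trees, assuming scikit-learn's DecisionTreeRegressor. The toy data, subsample fraction, number of trees, and the max_leaf_nodes parameter (used here as a stand-in for the number of splits) are illustrative choices, not the paper's settings.

    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    rng = np.random.default_rng(0)

    # Toy regression data: y = sin(4x) + noise (illustrative, not from the paper).
    n = 500
    X = rng.uniform(0, 1, size=(n, 1))
    y = np.sin(4 * X[:, 0]) + 0.3 * rng.normal(size=n)
    X_test = np.linspace(0, 1, 200).reshape(-1, 1)

    def subagged_predict(X, y, X_test, n_trees=50, frac=0.5, max_leaf_nodes=8):
        """Average predictions of trees fit on random subsamples drawn without replacement."""
        n_obs = len(y)
        m = int(frac * n_obs)
        preds = np.zeros((n_trees, len(X_test)))
        for b in range(n_trees):
            idx = rng.choice(n_obs, size=m, replace=False)  # subsample, not bootstrap
            tree = DecisionTreeRegressor(max_leaf_nodes=max_leaf_nodes)
            tree.fit(X[idx], y[idx])
            preds[b] = tree.predict(X_test)
        return preds.mean(axis=0)

    # Compare a single tree with subagging at the same tree size.
    single = DecisionTreeRegressor(max_leaf_nodes=8).fit(X, y).predict(X_test)
    subagged = subagged_predict(X, y, X_test, max_leaf_nodes=8)

Varying max_leaf_nodes in this sketch mirrors the paper's comparison of a single tree and subagging across different numbers of splits.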
Original language: English
Pages: 1-29
Number of pages: 29
DOIs
Publication status: Published - 2 Apr 2024

Publication series

Name: ArXiv Pre-print

Keywords

  • regression trees
  • pointwise consistency
  • bias-variance trade-off
  • bagging
  • CART
  • performance at optimal sizes
  • performance across sizes
