'Regression anytime' with brute-force SVD truncation

Christian Bender, Nikolaus Schweizer

Research output: Contribution to journal › Article › Scientific › peer-review


We propose a new least-squares Monte Carlo algorithm for the approximation of conditional expectations in the presence of stochastic derivative weights. The algorithm can serve as a building block for solving dynamic programming equations, which arise, for example, in nonlinear option pricing problems or in probabilistic discretization schemes for fully nonlinear parabolic partial differential equations. Our algorithm can be applied generically whenever the underlying dynamics stem from an Euler approximation to a stochastic differential equation. A built-in variance reduction ensures that convergence in the number of samples to the true regression function takes place at an arbitrarily fast polynomial rate if the problem under consideration is sufficiently smooth.
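The core building block described in the abstract is a least-squares Monte Carlo regression whose linear solve is stabilized by discarding small singular values of the design matrix. The following is a minimal illustrative sketch of that general idea, not the authors' algorithm: all names, the basis choice, the simulated data, and the truncation level are assumptions made for demonstration only.

```python
import numpy as np

# Hypothetical sketch: estimate a conditional expectation E[Y | X]
# by least-squares regression on a polynomial basis, solving the
# least-squares problem via an SVD with brute-force truncation of
# small singular values.

rng = np.random.default_rng(0)

# Simulated one-step data: state X, noisy call-like payoff Y.
n = 10_000
X = rng.normal(size=n)
Y = np.maximum(X + 0.1 * rng.normal(size=n), 0.0)

# Polynomial basis functions evaluated at the samples.
K = 5
A = np.vander(X, K, increasing=True)   # design matrix, shape (n, K)

# SVD of the design matrix; zero out singular values below a
# threshold (an assumed, illustrative choice of truncation level).
U, s, Vt = np.linalg.svd(A, full_matrices=False)
tau = 1e-8 * s[0]
s_inv = np.where(s > tau, 1.0 / s, 0.0)

# Truncated-pseudoinverse solve for the regression coefficients.
coef = Vt.T @ (s_inv * (U.T @ Y))

# The fitted map x -> basis(x) @ coef approximates E[Y | X = x].
def cond_exp(x):
    return np.vander(np.atleast_1d(x), K, increasing=True) @ coef
```

Truncating the singular values caps the norm of the pseudoinverse, which is what keeps the regression estimate stable when the empirical design matrix is ill-conditioned.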

Original language: English
Pages (from-to): 1140–1179
Journal: Annals of Applied Probability
Issue number: 3
Publication status: Published - Jun 2021


  • BSDEs
  • dynamic programming
  • least-squares Monte Carlo
  • Monte Carlo simulation
  • quantitative finance
  • regression later
  • statistical learning

