'Regression anytime' with brute-force SVD truncation

Christian Bender, Nikolaus Schweizer

Research output: Contribution to journal › Article › Scientific › peer-review


We propose a new least-squares Monte Carlo algorithm for the approximation of conditional expectations in the presence of stochastic derivative weights. The algorithm can serve as a building block for solving dynamic programming equations, which arise, e.g., in non-linear option pricing problems or in probabilistic discretization schemes for fully non-linear parabolic partial differential equations. Our algorithm can be generically applied when the underlying dynamics stem from an Euler approximation to a stochastic differential equation. A built-in variance reduction ensures that the convergence in the number of samples to the true regression function takes place at an arbitrarily fast polynomial rate, if the problem under consideration is smooth enough.
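To illustrate the core numerical ingredient named in the title, the sketch below shows a generic least-squares regression solved via singular value decomposition with brute-force truncation of small singular values. This is only a minimal, self-contained illustration of SVD-truncated least squares; the function name `svd_truncated_lsq`, the threshold `tau`, and the toy payoff data are illustrative assumptions, not the paper's actual algorithm, which additionally handles stochastic derivative weights and a built-in variance reduction.

```python
import numpy as np

def svd_truncated_lsq(X, y, tau=1e-8):
    """Solve min_b ||X b - y||^2, discarding singular values below tau.

    Truncation stabilizes the regression when the basis matrix X is
    ill-conditioned (e.g., nearly collinear basis functions).
    """
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    keep = s > tau  # brute-force truncation: drop tiny singular values
    # Pseudo-inverse restricted to the retained singular directions.
    return Vt[keep].T @ ((U[:, keep].T @ y) / s[keep])

# Toy example: regress simulated payoffs on a polynomial basis,
# mimicking one regression step of a least-squares Monte Carlo scheme.
rng = np.random.default_rng(0)
x = rng.normal(size=500)
basis = np.vander(x, 4)                      # polynomial basis functions
payoff = np.maximum(x, 0.0) + 0.1 * rng.normal(size=500)
beta = svd_truncated_lsq(basis, payoff)
approx = basis @ beta                        # fitted conditional expectation
```

When no singular value falls below `tau`, the result coincides with the ordinary least-squares solution; the truncation only intervenes for ill-conditioned bases.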
Original language: English
Journal: Annals of Applied Probability
Publication status: Accepted/In press - 2020

