Hamiltonian Monte Carlo for efficient Gaussian sampling: long and random steps

Simon Apers, Sander Gribling, Daniel Szilagyi

Research output: Contribution to journal › Article › Scientific › peer-review

Abstract

Hamiltonian Monte Carlo (HMC) is a Markov chain algorithm for sampling from a high-dimensional distribution with density $e^{-f(x)}$, given access to the gradient of $f$. A particular case of interest is that of a $d$-dimensional Gaussian distribution with covariance matrix $\Sigma$, in which case $f(x) = x^\top \Sigma^{-1} x$. We show that Metropolis-adjusted HMC can sample from a distribution that is $\varepsilon$-close to a Gaussian in total variation distance using $\widetilde{O}(\sqrt{\kappa}\, d^{1/4} \log(1/\varepsilon))$ gradient queries, where $\varepsilon > 0$ and $\kappa$ is the condition number of $\Sigma$. Our algorithm uses long and random integration times for the Hamiltonian dynamics, and it creates a warm start by first running HMC without a Metropolis adjustment. This contrasts with (and was motivated by) recent results that give an $\widetilde{\Omega}(\kappa d^{1/2})$ query lower bound for HMC with fixed integration times or from a cold start, even for the Gaussian case.
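
For intuition, here is a minimal Python sketch of Metropolis-adjusted HMC on a Gaussian target, using a leapfrog integrator and a number of leapfrog steps drawn at random each iteration. This is an illustration of the general mechanism described in the abstract, not the paper's algorithm: the step size, the uniform distribution over step counts, and the warm-start length are all hypothetical choices, and the potential is taken as $f(x) = \tfrac{1}{2} x^\top \Sigma^{-1} x$ (the abstract's $f$ up to a constant factor).

```python
import numpy as np

def leapfrog(x, p, grad_f, step_size, n_steps):
    """Leapfrog integration of Hamiltonian dynamics for H(x, p) = f(x) + |p|^2 / 2."""
    x, p = x.copy(), p.copy()
    p -= 0.5 * step_size * grad_f(x)              # initial half step in momentum
    for _ in range(n_steps - 1):
        x += step_size * p                        # full step in position
        p -= step_size * grad_f(x)                # full step in momentum
    x += step_size * p
    p -= 0.5 * step_size * grad_f(x)              # final half step in momentum
    return x, p

def hmc_gaussian(Sigma_inv, x0, n_iters, step_size, max_steps, rng, adjust=True):
    """Sketch of HMC for a Gaussian target with potential f(x) = x^T Sigma^{-1} x / 2.

    The number of leapfrog steps is drawn uniformly at random each iteration
    (an illustrative stand-in for the paper's randomized integration times).
    Setting adjust=False gives unadjusted HMC, usable to create a warm start.
    """
    f = lambda x: 0.5 * x @ Sigma_inv @ x
    grad_f = lambda x: Sigma_inv @ x
    x = np.asarray(x0, dtype=float).copy()
    samples = []
    for _ in range(n_iters):
        p = rng.standard_normal(x.shape[0])       # resample momentum
        n_steps = int(rng.integers(1, max_steps + 1))  # random integration time
        x_new, p_new = leapfrog(x, p, grad_f, step_size, n_steps)
        if adjust:
            # Metropolis correction: accept with probability min(1, exp(-dH))
            dH = (f(x_new) + 0.5 * p_new @ p_new) - (f(x) + 0.5 * p @ p)
            if np.log(rng.random()) < -dH:
                x = x_new
        else:
            x = x_new                             # unadjusted: always accept
        samples.append(x.copy())
    return np.array(samples)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    Sigma_inv = np.diag([1.0, 10.0, 100.0])       # condition number kappa = 100
    x0 = rng.standard_normal(3)                   # arbitrary cold start
    # Unadjusted chain to create a warm start, then the Metropolis-adjusted chain.
    warm = hmc_gaussian(Sigma_inv, x0, 100, 0.05, 10, rng, adjust=False)
    samples = hmc_gaussian(Sigma_inv, warm[-1], 1000, 0.05, 10, rng)
    print(np.cov(samples.T))                      # should approximate Sigma
```

The two-phase usage at the bottom mirrors the abstract's structure: a short unadjusted run supplies the warm start, after which the Metropolis-adjusted chain corrects the discretization bias.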
Original language: English
Article number: 348
Number of pages: 30
Journal: Journal of Machine Learning Research
Volume: 25
Publication status: Published - Jan 2024

Keywords

  • Hamiltonian Monte Carlo
  • Markov chains
  • Metropolis-Hastings algorithm
  • Logconcave sampling
  • Numerical integration
