Abstract
Hamiltonian Monte Carlo (HMC) is a Markov chain algorithm for sampling from a high-dimensional distribution with density $e^{-f(x)}$, given access to the gradient of $f$. A particular case of interest is that of a $d$-dimensional Gaussian distribution with covariance matrix $\Sigma$, in which case $f(x) = x^\top \Sigma^{-1} x$. We show that Metropolis-adjusted HMC can sample from a distribution that is $\varepsilon$-close to a Gaussian in total variation distance using $\widetilde{O}(\sqrt{\kappa}\, d^{1/4} \log(1/\varepsilon))$ gradient queries, where $\varepsilon > 0$ and $\kappa$ is the condition number of $\Sigma$. Our algorithm uses long and random integration times for the Hamiltonian dynamics, and it creates a warm start by first running HMC without a Metropolis adjustment. This contrasts with (and was motivated by) recent results that give an $\widetilde{\Omega}(\kappa d^{1/2})$ query lower bound for HMC with fixed integration times or from a cold start, even for the Gaussian case.
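For intuition, the ingredients mentioned in the abstract (leapfrog-discretized Hamiltonian dynamics, a Metropolis accept/reject correction, and a randomized integration time) can be sketched for a Gaussian target. This is only an illustrative toy implementation: the step size, the trajectory-length distribution, and the zero-mean start are ad hoc choices, not the tuned parameters or warm-start procedure from the paper.

```python
import numpy as np

def hmc_gaussian(sigma_inv, n_samples, step=0.1, max_leaps=50, seed=0):
    """Toy Metropolis-adjusted HMC for a target density prop. to exp(-f(x)),
    with f(x) = 0.5 * x^T sigma_inv x (an illustrative choice of potential)."""
    rng = np.random.default_rng(seed)
    d = sigma_inv.shape[0]
    f = lambda x: 0.5 * x @ sigma_inv @ x          # potential energy
    grad = lambda x: sigma_inv @ x                 # gradient of f
    x = np.zeros(d)                                # ad hoc start at the mode
    samples = []
    for _ in range(n_samples):
        p = rng.standard_normal(d)                 # resample Gaussian momentum
        h0 = f(x) + 0.5 * p @ p                    # initial Hamiltonian
        xn, pn = x.copy(), p.copy()
        L = int(rng.integers(1, max_leaps + 1))    # randomized integration time
        # Leapfrog integration of the Hamiltonian dynamics for L steps.
        pn -= 0.5 * step * grad(xn)
        for i in range(L):
            xn += step * pn
            if i < L - 1:
                pn -= step * grad(xn)
        pn -= 0.5 * step * grad(xn)
        # Metropolis adjustment: accept/reject corrects the discretization error.
        h1 = f(xn) + 0.5 * pn @ pn
        if rng.random() < np.exp(min(0.0, h0 - h1)):
            x = xn
        samples.append(x.copy())
    return np.array(samples)
```

Each iteration costs one gradient query per leapfrog step, which is why the query complexity in the abstract is stated in terms of gradient evaluations.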
Original language | English |
---|---|
Article number | 348 |
Number of pages | 30 |
Journal | Journal of Machine Learning Research |
Volume | 25 |
Publication status | Published - Jan 2024 |
Keywords
- Hamiltonian Monte Carlo
- Markov chains
- Metropolis-Hastings algorithm
- Logconcave sampling
- Numerical integration