Abstract
Perturbation theory for Markov chains addresses the question of how small differences in the transition probabilities of Markov chains are reflected in differences between their distributions. We prove powerful and flexible bounds on the distance between the nth step distributions of two Markov chains when one of them satisfies a Wasserstein ergodicity condition. Our work is motivated by the recent interest in approximate Markov chain Monte Carlo (MCMC) methods in the analysis of big data sets. By using an approach based on Lyapunov functions, we provide estimates for geometrically ergodic Markov chains under weak assumptions. In an autoregressive model, our bounds cannot be improved in general. We illustrate our theory by showing quantitative estimates for approximate versions of two prominent MCMC algorithms,
the Metropolis-Hastings and stochastic Langevin algorithms.
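As a rough, hypothetical illustration of the approximate MCMC setting described in the abstract (not the paper's construction), the sketch below compares the nth step distribution of an exact random-walk Metropolis-Hastings chain targeting a standard normal with that of a perturbed chain whose acceptance step uses a noisy log-density. The target, the form of the perturbation, and all function names are illustrative assumptions.

```python
# Minimal sketch, assuming a standard normal target and a multiplicative-free
# additive noise perturbation of the acceptance ratio as a toy stand-in for an
# "approximate" transition kernel.
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    # Log-density of the standard normal target (up to an additive constant).
    return -0.5 * x * x

def mh_step(x, noise_scale=0.0, step=1.0):
    # One random-walk Metropolis-Hastings update; noise_scale > 0 corrupts the
    # acceptance ratio, mimicking an inexact/approximate chain.
    y = x + step * rng.normal()
    log_ratio = log_target(y) - log_target(x) + noise_scale * rng.normal()
    return y if np.log(rng.uniform()) < log_ratio else x

def sample_nth_step(n_steps, n_chains, noise_scale):
    # Empirical nth step distribution over many independent chains,
    # all started from the same point x0 = 3.0.
    xs = np.full(n_chains, 3.0)
    for _ in range(n_steps):
        xs = np.array([mh_step(x, noise_scale) for x in xs])
    return xs

exact = sample_nth_step(n_steps=50, n_chains=2000, noise_scale=0.0)
perturbed = sample_nth_step(n_steps=50, n_chains=2000, noise_scale=0.1)

# Crude estimate of the 1-Wasserstein distance between the two empirical
# nth step distributions (sorted-sample coupling, valid in one dimension).
w1 = np.mean(np.abs(np.sort(exact) - np.sort(perturbed)))
print(f"estimated W1 distance at step n=50: {w1:.3f}")
```

The one-dimensional sorted-sample coupling gives a simple empirical proxy for the Wasserstein distance that the paper's bounds control; in higher dimensions a different estimator would be needed.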
| Original language | English |
| --- | --- |
| Pages (from-to) | 2610-2639 |
| Journal | Bernoulli |
| Volume | 24 |
| Issue number | 4A |
| DOIs | |
| Publication status | Published - Nov 2018 |
Keywords
- perturbations
- Markov chains
- Wasserstein distance
- MCMC
- big data