Abstract
Recent work has suggested using Monte Carlo methods based on piecewise deterministic Markov processes (PDMPs) to sample from target distributions of interest. PDMPs are non-reversible continuous-time processes endowed with momentum, and hence can mix better than standard reversible MCMC samplers. Furthermore, they can incorporate exact sub-sampling schemes which only require access to a single (randomly selected) data point at each iteration, yet without introducing bias to the algorithm’s stationary distribution. However, the range of models for which PDMPs can be used, particularly with sub-sampling, is limited. We propose approximate simulation of PDMPs with sub-sampling for scalable sampling from posterior distributions. The approximation takes the form of an Euler approximation to the true PDMP dynamics, and involves using an estimate of the gradient of the log-posterior based on a data sub-sample. We thus call this class of algorithms stochastic-gradient PDMPs. Importantly, the trajectories of stochastic-gradient PDMPs are continuous and can leverage recent ideas for sampling from measures with continuous and atomic components. We show these methods are easy to implement, present results on their approximation error and demonstrate numerically that this class of algorithms has similar efficiency to, but is more robust than, stochastic gradient Langevin dynamics.
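For intuition, a minimal, hypothetical sketch of the kind of scheme described above is given below for a Zig-Zag-type process; this is not the paper's implementation, and the function `grad_estimate`, the step size, and the batch size are illustrative assumptions. The continuous-time velocity-flip events are replaced by per-step flip probabilities computed from a mini-batch gradient estimate, and the position is advanced by an Euler step along the current velocity.

```python
import numpy as np

def sg_zigzag(grad_estimate, x0, n_steps, step_size, batch_size, n_data, rng=None):
    """Euler-type discretisation of Zig-Zag dynamics with sub-sampled gradients.

    grad_estimate(x, idx) should return an unbiased estimate of the gradient of
    the negative log-posterior, built from the data sub-sample `idx`.
    (Illustrative sketch only; the paper's algorithms may differ in details
    such as the event-rate discretisation or the use of control variates.)
    """
    rng = np.random.default_rng() if rng is None else rng
    d = x0.shape[0]
    x = x0.copy()
    v = rng.choice([-1.0, 1.0], size=d)   # Zig-Zag velocities are +/-1 per coordinate
    samples = np.empty((n_steps, d))
    for t in range(n_steps):
        idx = rng.integers(0, n_data, size=batch_size)   # random data sub-sample
        g = grad_estimate(x, idx)
        # Per-coordinate event rates: lambda_i = max(0, v_i * dU/dx_i)
        rates = np.maximum(0.0, v * g)
        # First-order (Euler) approximation: flip v_i with probability ~ rate_i * step_size
        flip = rng.random(d) < np.minimum(1.0, rates * step_size)
        v = np.where(flip, -v, v)
        # Deterministic drift along the (possibly updated) velocity
        x = x + step_size * v
        samples[t] = x
    return samples
```

Because the position moves along piecewise-linear trajectories rather than by a diffusion-style proposal, the output retains the continuous character of the underlying PDMP, which is what allows the methods to handle targets with continuous and atomic components.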
Citation
Fearnhead, P., Grazzi, S., Nemeth, C. and Roberts, G. O. (2024). Stochastic Gradient Piecewise Deterministic Monte Carlo Samplers. arXiv preprint arXiv:2406.19051.
@article{fearnhead2024stochastic,
  title={Stochastic Gradient Piecewise Deterministic Monte Carlo Samplers},
  author={Fearnhead, Paul and Grazzi, Sebastiano and Nemeth, Chris and Roberts, Gareth O},
  journal={arXiv preprint arXiv:2406.19051},
  year={2024}
}