Convergence Bounds for Sequential Monte Carlo on Multimodal Distributions using Soft Decomposition
Format | Journal Article |
Language | English |
Published | 29.05.2024 |
Summary: | We prove bounds on the variance of a function $f$ under the empirical measure of the samples obtained by the Sequential Monte Carlo (SMC) algorithm, with time complexity depending on local rather than global Markov chain mixing dynamics. SMC is a Markov Chain Monte Carlo (MCMC) method, which starts by drawing $N$ particles from a known distribution, and then, through a sequence of distributions, re-weights and re-samples the particles, at each instance applying a Markov chain for smoothing. In principle, SMC tries to alleviate problems from multi-modality. However, most theoretical guarantees for SMC are obtained by assuming global mixing time bounds, which are only efficient in the uni-modal setting. We show that bounds can be obtained in the truly multi-modal setting, with mixing times that depend only on local MCMC dynamics. |
DOI: | 10.48550/arxiv.2405.19553 |
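The summary above outlines the generic SMC recipe: draw $N$ particles from a known base distribution, then pass through a sequence of intermediate distributions, re-weighting and re-sampling the particles and applying an MCMC smoothing kernel at each stage. Below is a minimal, self-contained Python sketch of that recipe; the bimodal Gaussian-mixture target, the linear tempering schedule, and the random-walk Metropolis smoothing kernel are illustrative assumptions, not details taken from the paper.

```python
# Minimal SMC sketch (illustrative only): re-weight, re-sample, MCMC-smooth
# along a tempered path from a known base distribution to a bimodal target.
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    # Example bimodal target: mixture of two unit-variance Gaussians (assumption).
    return np.logaddexp(-0.5 * (x - 4.0) ** 2, -0.5 * (x + 4.0) ** 2)

def log_base(x):
    # Known starting distribution: a wide zero-mean Gaussian.
    return -0.5 * (x / 5.0) ** 2

def smc(n_particles=1000, n_steps=20, n_mcmc=5, step=1.0):
    x = rng.normal(0.0, 5.0, size=n_particles)   # N particles from the base
    betas = np.linspace(0.0, 1.0, n_steps + 1)   # tempering schedule (assumption)
    for b_prev, b in zip(betas[:-1], betas[1:]):
        # Re-weight: incremental importance weights between successive
        # tempered distributions pi_b ∝ base^(1-b) * target^b.
        log_w = (b - b_prev) * (log_target(x) - log_base(x))
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        # Re-sample: multinomial resampling proportional to the weights.
        x = x[rng.choice(n_particles, size=n_particles, p=w)]
        # Smooth: a few random-walk Metropolis steps targeting pi_b.
        def log_pi(y):
            return (1.0 - b) * log_base(y) + b * log_target(y)
        for _ in range(n_mcmc):
            prop = x + step * rng.normal(size=n_particles)
            accept = np.log(rng.uniform(size=n_particles)) < log_pi(prop) - log_pi(x)
            x = np.where(accept, prop, x)
    return x

samples = smc()
# Empirical estimate of E[f] for f(x) = x**2 under the final particle cloud.
print("estimate of E[x^2]:", np.mean(samples ** 2))
```

The paper's bounds concern the variance of such empirical estimates of $f$; the point of the local-mixing analysis is that the Metropolis kernel only needs to mix well within each mode, not across the modes, for the re-weighting and re-sampling steps to transfer mass between them.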