De-Sequentialized Monte Carlo: a parallel-in-time particle smoother

A1 Original article in a scientific journal
Date
2022-08-01
Language
en
Pages
39
Series
Journal of Machine Learning Research, Volume 23
Abstract
Particle smoothers are SMC (Sequential Monte Carlo) algorithms designed to approximate the joint distribution of the states given the observations from a state-space model. We propose dSMC (de-Sequentialized Monte Carlo), a new particle smoother that processes T observations in O(log T) time on parallel architectures. This compares favorably with standard particle smoothers, whose complexity is linear in T. We derive Lp convergence results for dSMC, with an explicit upper bound that is polynomial in T. We then discuss how to reduce the variance of the smoothing estimates computed by dSMC by (i) designing good proposal distributions for sampling the particles at the initialization of the algorithm, and by (ii) using lazy resampling to increase the number of particles used in dSMC. Finally, we design a particle Gibbs sampler based on dSMC, which can perform parameter inference in a state-space model at an O(log T) cost on parallel hardware.
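The sketch below illustrates only the general parallel-in-time principle the abstract appeals to, not the dSMC algorithm itself: an associative combination of T elements can be evaluated in O(log T) parallel depth rather than O(T) sequential steps. The toy affine recursion, the coefficient values, and the use of jax.lax.associative_scan are illustrative assumptions, not taken from the paper.

    # Minimal sketch (assumed example, not the authors' dSMC algorithm):
    # evaluate the affine recursion x_t = a_t * x_{t-1} + b_t for all t
    # in O(log T) parallel depth, using the fact that composition of
    # affine maps (a2, b2) o (a1, b1) = (a2*a1, a2*b1 + b2) is associative.
    import jax
    import jax.numpy as jnp

    def combine(elem1, elem2):
        # Compose two affine maps x -> a*x + b; associativity of this
        # operation is what lets associative_scan parallelize the recursion.
        a1, b1 = elem1
        a2, b2 = elem2
        return a2 * a1, a2 * b1 + b2

    T = 1024
    key = jax.random.PRNGKey(0)
    a = 0.9 * jnp.ones(T)              # transition coefficients (assumed values)
    b = jax.random.normal(key, (T,))   # per-step offsets, e.g. noise terms

    # Prefix compositions for every t, computed in O(log T) parallel depth.
    a_prefix, b_prefix = jax.lax.associative_scan(combine, (a, b))

    # With x_0 = 0, the state at time t is the accumulated offset b_prefix[t].
    x = b_prefix
    print(x.shape)  # (1024,)

dSMC applies a divide-and-conquer construction of the same logarithmic-depth flavor to particle smoothing; the code above only conveys why parallel hardware can reach O(log T) span for a recursion that looks inherently sequential.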
Citation
Corenflos, A., Särkkä, S. & Chopin, N. (2022). 'De-Sequentialized Monte Carlo: a parallel-in-time particle smoother', Journal of Machine Learning Research, vol. 23. https://www.jmlr.org/papers/v23/22-0140.html