Replica Conditional Sequential Monte Carlo

Alex Shestopaloff, Arnaud Doucet
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:5749-5757, 2019.

Abstract

We propose a Markov chain Monte Carlo (MCMC) scheme to perform state inference in non-linear non-Gaussian state-space models. Current state-of-the-art methods to address this problem rely on particle MCMC techniques and their variants, such as the iterated conditional Sequential Monte Carlo (cSMC) scheme, which uses a Sequential Monte Carlo (SMC) type proposal within MCMC. A deficiency of standard SMC proposals is that they only use observations up to time $t$ to propose states at time $t$, even when an entire observation sequence is available. More sophisticated SMC proposals based on lookahead techniques could be used, but they can be difficult to put into practice. We propose here replica cSMC, where we build SMC proposals for one replica using information from the entire observation sequence by conditioning on the states of the other replicas. This approach is easily parallelizable, and we demonstrate its excellent empirical performance when compared to the standard iterated cSMC scheme at fixed computational complexity.
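
For readers unfamiliar with the cSMC building block the abstract refers to, below is a minimal, illustrative Python sketch of one iterated cSMC (particle Gibbs) update with a bootstrap proposal and multinomial resampling. It is not the paper's replica variant, and the model interface (f_sample, g_logpdf) and all parameter names are assumptions made for this sketch only.

import numpy as np

def csmc_update(x_ref, y, N, f_sample, g_logpdf, rng):
    # One conditional SMC sweep with a bootstrap proposal and multinomial
    # resampling. Given a reference trajectory x_ref of length T, it returns
    # a new trajectory while leaving the smoothing distribution invariant.
    # f_sample(t, x_prev, n, rng) draws n states at time t (x_prev is None at
    # t = 0); g_logpdf(t, x, y_t) returns observation log-densities. Both are
    # assumed vectorised over particles; states are scalar for simplicity.
    T = len(y)
    X = np.empty((T, N))                 # particle values
    A = np.zeros((T, N), dtype=int)      # ancestor indices
    X[0] = f_sample(0, None, N, rng)
    X[0, 0] = x_ref[0]                   # pin particle 0 to the reference path
    logw = g_logpdf(0, X[0], y[0])

    for t in range(1, T):
        w = np.exp(logw - logw.max()); w /= w.sum()
        A[t, 1:] = rng.choice(N, size=N - 1, p=w)    # resample free particles
        A[t, 0] = 0                                  # keep the reference lineage
        X[t] = f_sample(t, X[t - 1, A[t]], N, rng)
        X[t, 0] = x_ref[t]
        logw = g_logpdf(t, X[t], y[t])

    # sample one index at time T-1 and trace its ancestry back to time 0
    w = np.exp(logw - logw.max()); w /= w.sum()
    k = rng.choice(N, p=w)
    traj = np.empty(T)
    for t in range(T - 1, -1, -1):
        traj[t] = X[t, k]
        k = A[t, k]
    return traj

# Example usage on a scalar Gaussian random-walk model (illustrative only):
rng = np.random.default_rng(0)
T, N = 50, 100
f = lambda t, xp, n, r: r.normal(0.0, 1.0, n) if xp is None else xp + r.normal(0.0, 1.0, n)
g = lambda t, x, yt: -0.5 * (yt - x) ** 2    # unit-variance Gaussian observations
y = rng.normal(size=T)
x = np.zeros(T)                              # initial reference trajectory
for _ in range(10):                          # iterated cSMC: repeat the update
    x = csmc_update(x, y, N, f, g, rng)

In the paper's replica cSMC, the bootstrap propagation step above would instead be a proposal for one replica that conditions on the current state sequences of the other replicas, so that information from the entire observation sequence can inform the proposal at time $t$; that replica-informed proposal is not sketched here.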

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-shestopaloff19a,
  title     = {Replica Conditional Sequential {M}onte {C}arlo},
  author    = {Shestopaloff, Alex and Doucet, Arnaud},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {5749--5757},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/shestopaloff19a/shestopaloff19a.pdf},
  url       = {https://proceedings.mlr.press/v97/shestopaloff19a.html},
  abstract  = {We propose a Markov chain Monte Carlo (MCMC) scheme to perform state inference in non-linear non-Gaussian state-space models. Current state-of-the-art methods to address this problem rely on particle MCMC techniques and its variants, such as the iterated conditional Sequential Monte Carlo (cSMC) scheme, which uses a Sequential Monte Carlo (SMC) type proposal within MCMC. A deficiency of standard SMC proposals is that they only use observations up to time $t$ to propose states at time $t$ when an entire observation sequence is available. More sophisticated SMC based on lookahead techniques could be used but they can be difficult to put in practice. We propose here replica cSMC where we build SMC proposals for one replica using information from the entire observation sequence by conditioning on the states of the other replicas. This approach is easily parallelizable and we demonstrate its excellent empirical performance when compared to the standard iterated cSMC scheme at fixed computational complexity.}
}
Endnote
%0 Conference Paper
%T Replica Conditional Sequential Monte Carlo
%A Alex Shestopaloff
%A Arnaud Doucet
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-shestopaloff19a
%I PMLR
%P 5749--5757
%U https://proceedings.mlr.press/v97/shestopaloff19a.html
%V 97
%X We propose a Markov chain Monte Carlo (MCMC) scheme to perform state inference in non-linear non-Gaussian state-space models. Current state-of-the-art methods to address this problem rely on particle MCMC techniques and its variants, such as the iterated conditional Sequential Monte Carlo (cSMC) scheme, which uses a Sequential Monte Carlo (SMC) type proposal within MCMC. A deficiency of standard SMC proposals is that they only use observations up to time $t$ to propose states at time $t$ when an entire observation sequence is available. More sophisticated SMC based on lookahead techniques could be used but they can be difficult to put in practice. We propose here replica cSMC where we build SMC proposals for one replica using information from the entire observation sequence by conditioning on the states of the other replicas. This approach is easily parallelizable and we demonstrate its excellent empirical performance when compared to the standard iterated cSMC scheme at fixed computational complexity.
APA
Shestopaloff, A. & Doucet, A. (2019). Replica Conditional Sequential Monte Carlo. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:5749-5757. Available from https://proceedings.mlr.press/v97/shestopaloff19a.html.