Nested Sequential Monte Carlo Methods

Christian Naesseth, Fredrik Lindsten, Thomas Schön
Proceedings of the 32nd International Conference on Machine Learning, PMLR 37:1292-1301, 2015.

Abstract

We propose nested sequential Monte Carlo (NSMC), a methodology to sample from sequences of probability distributions, even where the random variables are high-dimensional. NSMC generalises the SMC framework by requiring only approximate, properly weighted samples from the SMC proposal distribution, while still resulting in a correct SMC algorithm. Furthermore, NSMC can itself be used to produce such properly weighted samples. Consequently, one NSMC sampler can be used to construct an efficient high-dimensional proposal distribution for another NSMC sampler, and this nesting of the algorithm can be done to an arbitrary degree. This allows us to consider complex and high-dimensional models using SMC. We show results that demonstrate the efficacy of our approach on several filtering problems with dimensions on the order of 100 to 1000.
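To make the SMC building block concrete, the following is a minimal sketch of a standard bootstrap particle filter for a hypothetical 1-D linear Gaussian state-space model. This is plain SMC, not the paper's NSMC: in NSMC, the exact draw from the proposal would be replaced by an approximate, properly weighted sample produced by an inner (nested) sampler. All model parameters and names below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def bootstrap_pf(y, n_particles=500, phi=0.9, q=1.0, r=1.0, seed=0):
    """Bootstrap particle filter for x_t = phi*x_{t-1} + N(0,q),
    y_t = x_t + N(0,r). Illustrative sketch only (not NSMC)."""
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, 1.0, n_particles)        # initial particles
    means = []
    for yt in y:
        # Propagate through the transition kernel (the SMC proposal here).
        # NSMC would instead accept an approximate, properly weighted
        # sample from this step, e.g. produced by an inner SMC sampler.
        x = phi * x + rng.normal(0.0, np.sqrt(q), n_particles)
        # Weight by the observation likelihood N(y_t; x_t, r),
        # computed in log space for numerical stability.
        logw = -0.5 * (yt - x) ** 2 / r
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(np.sum(w * x))              # filtering mean estimate
        # Multinomial resampling.
        x = rng.choice(x, size=n_particles, p=w)
    return np.array(means)
```

The key property NSMC relies on is that the weighted particles need only be *properly weighted* for the proposal, i.e. weighted averages are unbiased estimates of expectations under it, which is what licenses nesting one sampler inside another.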

Cite this Paper


BibTeX
@InProceedings{pmlr-v37-naesseth15,
  title     = {Nested Sequential Monte Carlo Methods},
  author    = {Naesseth, Christian and Lindsten, Fredrik and Sch{\"o}n, Thomas},
  booktitle = {Proceedings of the 32nd International Conference on Machine Learning},
  pages     = {1292--1301},
  year      = {2015},
  editor    = {Bach, Francis and Blei, David},
  volume    = {37},
  series    = {Proceedings of Machine Learning Research},
  address   = {Lille, France},
  month     = {07--09 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v37/naesseth15.pdf},
  url       = {https://proceedings.mlr.press/v37/naesseth15.html},
  abstract  = {We propose nested sequential Monte Carlo (NSMC), a methodology to sample from sequences of probability distributions, even where the random variables are high-dimensional. NSMC generalises the SMC framework by requiring only approximate, properly weighted, samples from the SMC proposal distribution, while still resulting in a correct SMC algorithm. Furthermore, NSMC can in itself be used to produce such properly weighted samples. Consequently, one NSMC sampler can be used to construct an efficient high-dimensional proposal distribution for another NSMC sampler, and this nesting of the algorithm can be done to an arbitrary degree. This allows us to consider complex and high-dimensional models using SMC. We show results that motivate the efficacy of our approach on several filtering problems with dimensions in the order of 100 to 1000.}
}