Slice Sampling for General Completely Random Measures

Peiyuan Zhu, Alexandre Bouchard-Cote, Trevor Campbell
Proceedings of the 36th Conference on Uncertainty in Artificial Intelligence (UAI), PMLR 124:699-708, 2020.

Abstract

Completely random measures provide a principled approach to creating flexible unsupervised models, where the number of latent features is infinite and the number of features that influence the data grows with the size of the data set. Due to the infinite number of latent features, posterior inference requires either marginalization—resulting in dependence structures that prevent efficient computation via parallelization and conjugacy—or finite truncation, which arbitrarily limits the flexibility of the model. In this paper we present a novel Markov chain Monte Carlo algorithm for posterior inference that adaptively sets the truncation level using auxiliary slice variables, enabling efficient, parallelized computation without sacrificing flexibility. In contrast to past work that achieved this on a model-by-model basis, we provide a general recipe that is applicable to the broad class of completely random measure-based priors. The efficacy of the proposed algorithm is evaluated on several popular nonparametric models, demonstrating a higher effective sample size per second compared to algorithms using marginalization as well as a higher predictive performance compared to models employing fixed truncations.
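To illustrate the core idea behind auxiliary slice variables, here is a minimal sketch (not the paper's general CRM algorithm) using the stick-breaking construction of a Dirichlet process: conditioned on a slice level u, only the finitely many atoms whose weight can exceed u need to be instantiated, so the truncation level adapts automatically rather than being fixed in advance. The function names and the choice of the Dirichlet process are illustrative assumptions.

```python
import numpy as np

def slice_truncation(rng, alpha, u):
    """Lazily generate Dirichlet-process stick-breaking weights until the
    remaining stick mass falls below the slice level u. Every atom not yet
    generated then has weight < u, i.e. it lies below the slice and can be
    ignored, so the returned array contains all atoms above the slice."""
    weights = []
    remaining = 1.0  # mass of the unbroken stick
    while remaining > u:
        beta = rng.beta(1.0, alpha)      # stick-breaking proportion
        weights.append(beta * remaining)  # weight of the next atom
        remaining *= (1.0 - beta)         # mass left for later atoms
    return np.array(weights)

rng = np.random.default_rng(0)
u = 0.01  # slice level; in a full sampler this is resampled each iteration
w = slice_truncation(rng, alpha=1.0, u=u)
# finitely many atoms are instantiated, covering all but at most u of the mass
print(len(w), float(w.sum()))
```

In a full sampler the slice level u is itself resampled given the current assignments, which is what lets the truncation grow or shrink adaptively across MCMC iterations.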

Cite this Paper


BibTeX
@InProceedings{pmlr-v124-zhu20a,
  title = {Slice Sampling for General Completely Random Measures},
  author = {Zhu, Peiyuan and Bouchard-Cote, Alexandre and Campbell, Trevor},
  booktitle = {Proceedings of the 36th Conference on Uncertainty in Artificial Intelligence (UAI)},
  pages = {699--708},
  year = {2020},
  editor = {Jonas Peters and David Sontag},
  volume = {124},
  series = {Proceedings of Machine Learning Research},
  month = {03--06 Aug},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v124/zhu20a/zhu20a.pdf},
  url = {http://proceedings.mlr.press/v124/zhu20a.html},
  abstract = {Completely random measures provide a principled approach to creating flexible unsupervised models, where the number of latent features is infinite and the number of features that influence the data grows with the size of the data set. Due to the infinite number of latent features, posterior inference requires either marginalization—resulting in dependence structures that prevent efficient computation via parallelization and conjugacy—or finite truncation, which arbitrarily limits the flexibility of the model. In this paper we present a novel Markov chain Monte Carlo algorithm for posterior inference that adaptively sets the truncation level using auxiliary slice variables, enabling efficient, parallelized computation without sacrificing flexibility. In contrast to past work that achieved this on a model-by-model basis, we provide a general recipe that is applicable to the broad class of completely random measure-based priors. The efficacy of the proposed algorithm is evaluated on several popular nonparametric models, demonstrating a higher effective sample size per second compared to algorithms using marginalization as well as a higher predictive performance compared to models employing fixed truncations.}
}
Endnote
%0 Conference Paper
%T Slice Sampling for General Completely Random Measures
%A Peiyuan Zhu
%A Alexandre Bouchard-Cote
%A Trevor Campbell
%B Proceedings of the 36th Conference on Uncertainty in Artificial Intelligence (UAI)
%C Proceedings of Machine Learning Research
%D 2020
%E Jonas Peters
%E David Sontag
%F pmlr-v124-zhu20a
%I PMLR
%P 699--708
%U http://proceedings.mlr.press/v124/zhu20a.html
%V 124
%X Completely random measures provide a principled approach to creating flexible unsupervised models, where the number of latent features is infinite and the number of features that influence the data grows with the size of the data set. Due to the infinite number of latent features, posterior inference requires either marginalization—resulting in dependence structures that prevent efficient computation via parallelization and conjugacy—or finite truncation, which arbitrarily limits the flexibility of the model. In this paper we present a novel Markov chain Monte Carlo algorithm for posterior inference that adaptively sets the truncation level using auxiliary slice variables, enabling efficient, parallelized computation without sacrificing flexibility. In contrast to past work that achieved this on a model-by-model basis, we provide a general recipe that is applicable to the broad class of completely random measure-based priors. The efficacy of the proposed algorithm is evaluated on several popular nonparametric models, demonstrating a higher effective sample size per second compared to algorithms using marginalization as well as a higher predictive performance compared to models employing fixed truncations.
APA
Zhu, P., Bouchard-Cote, A. & Campbell, T. (2020). Slice Sampling for General Completely Random Measures. Proceedings of the 36th Conference on Uncertainty in Artificial Intelligence (UAI), in Proceedings of Machine Learning Research 124:699-708. Available from http://proceedings.mlr.press/v124/zhu20a.html.