Fast Mixing for Discrete Point Processes

Patrick Rebeschini, Amin Karbasi
Proceedings of The 28th Conference on Learning Theory, PMLR 40:1480-1500, 2015.

Abstract

We investigate a systematic mechanism for designing fast mixing Markov chain Monte Carlo algorithms to sample from discrete point processes under the Dobrushin uniqueness condition for Gibbs measures. Discrete point processes are defined as probability distributions μ(S) ∝ exp(βf(S)) over all subsets S ∈ 2^V of a finite set V, through a bounded set function f: 2^V → ℝ and a parameter β > 0. A subclass of discrete point processes characterized by submodular functions (which includes log-submodular distributions, submodular point processes, and determinantal point processes) has recently gained a lot of interest in machine learning and has been shown to be effective for modeling diversity and coverage. We show that if the set function (not necessarily submodular) displays a natural notion of decay of correlation, then, for β small enough, it is possible to design fast mixing Markov chain Monte Carlo methods that yield error bounds on marginal approximations that do not depend on the size of the set V. The sufficient conditions that we derive involve a control on the (discrete) Hessian of set functions, a quantity that has not been previously considered in the literature. We specialize our results to submodular functions, and we discuss canonical examples where the Hessian can be easily controlled.
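The sampling dynamics the abstract alludes to can be illustrated with single-site Gibbs (Glauber) updates: at each step, pick an element i of V and resample its membership from the conditional distribution induced by μ(S) ∝ exp(βf(S)). The sketch below is a minimal illustration, not the paper's implementation; `coverage` is a made-up submodular example function, and `discrete_hessian` computes the second-order difference that the paper's conditions control:

```python
import math
import random

def gibbs_sample(f, n, beta, steps, seed=0):
    """Single-site Gibbs (Glauber) sampler for mu(S) proportional to
    exp(beta * f(S)) over subsets S of V = {0, ..., n-1}.
    f maps a frozenset to a float."""
    rng = random.Random(seed)
    S = set()
    for _ in range(steps):
        i = rng.randrange(n)                  # pick a coordinate uniformly
        with_i = frozenset(S | {i})
        without_i = frozenset(S - {i})
        # Conditional probability that i belongs to the set, given the rest:
        # P(i in S | S_{-i}) = 1 / (1 + exp(-beta * (f(S+i) - f(S-i))))
        p_in = 1.0 / (1.0 + math.exp(-beta * (f(with_i) - f(without_i))))
        if rng.random() < p_in:
            S.add(i)
        else:
            S.discard(i)
    return S

def discrete_hessian(f, S, i, j):
    """Second-order difference f(S+{i,j}) - f(S+{i}) - f(S+{j}) + f(S),
    the (discrete) Hessian entry at (i, j); nonpositive for submodular f."""
    base = frozenset(S) - {i, j}
    return f(base | {i, j}) - f(base | {i}) - f(base | {j}) + f(base)

# Example: a simple coverage (submodular) function on n = 6 items.
def coverage(S):
    groups = [{0, 1}, {2, 3}, {4, 5}]
    return sum(1 for g in groups if g & S)

sample = gibbs_sample(coverage, n=6, beta=0.5, steps=2000)
```

For small β the chain mixes fast in the regime the paper studies; for large β the same dynamics can mix slowly, which is why the Dobrushin-type condition involves both β and the discrete Hessian.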

Cite this Paper


BibTeX
@InProceedings{pmlr-v40-Rebeschini15,
  title = {Fast Mixing for Discrete Point Processes},
  author = {Rebeschini, Patrick and Karbasi, Amin},
  booktitle = {Proceedings of The 28th Conference on Learning Theory},
  pages = {1480--1500},
  year = {2015},
  editor = {Grünwald, Peter and Hazan, Elad and Kale, Satyen},
  volume = {40},
  series = {Proceedings of Machine Learning Research},
  address = {Paris, France},
  month = {03--06 Jul},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v40/Rebeschini15.pdf},
  url = {https://proceedings.mlr.press/v40/Rebeschini15.html},
  abstract = {We investigate a systematic mechanism for designing fast mixing Markov chain Monte Carlo algorithms to sample from discrete point processes under the Dobrushin uniqueness condition for Gibbs measures. Discrete point processes are defined as probability distributions $\mu(S) \propto \exp(\beta f(S))$ over all subsets $S \in 2^V$ of a finite set $V$, through a bounded set function $f: 2^V \to \mathbb{R}$ and a parameter $\beta > 0$. A subclass of discrete point processes characterized by submodular functions (which includes log-submodular distributions, submodular point processes, and determinantal point processes) has recently gained a lot of interest in machine learning and has been shown to be effective for modeling diversity and coverage. We show that if the set function (not necessarily submodular) displays a natural notion of decay of correlation, then, for $\beta$ small enough, it is possible to design fast mixing Markov chain Monte Carlo methods that yield error bounds on marginal approximations that do not depend on the size of the set $V$. The sufficient conditions that we derive involve a control on the (discrete) Hessian of set functions, a quantity that has not been previously considered in the literature. We specialize our results to submodular functions, and we discuss canonical examples where the Hessian can be easily controlled.}
}
Endnote
%0 Conference Paper
%T Fast Mixing for Discrete Point Processes
%A Patrick Rebeschini
%A Amin Karbasi
%B Proceedings of The 28th Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2015
%E Peter Grünwald
%E Elad Hazan
%E Satyen Kale
%F pmlr-v40-Rebeschini15
%I PMLR
%P 1480--1500
%U https://proceedings.mlr.press/v40/Rebeschini15.html
%V 40
%X We investigate a systematic mechanism for designing fast mixing Markov chain Monte Carlo algorithms to sample from discrete point processes under the Dobrushin uniqueness condition for Gibbs measures. Discrete point processes are defined as probability distributions μ(S) ∝ exp(βf(S)) over all subsets S ∈ 2^V of a finite set V, through a bounded set function f: 2^V → ℝ and a parameter β > 0. A subclass of discrete point processes characterized by submodular functions (which includes log-submodular distributions, submodular point processes, and determinantal point processes) has recently gained a lot of interest in machine learning and has been shown to be effective for modeling diversity and coverage. We show that if the set function (not necessarily submodular) displays a natural notion of decay of correlation, then, for β small enough, it is possible to design fast mixing Markov chain Monte Carlo methods that yield error bounds on marginal approximations that do not depend on the size of the set V. The sufficient conditions that we derive involve a control on the (discrete) Hessian of set functions, a quantity that has not been previously considered in the literature. We specialize our results to submodular functions, and we discuss canonical examples where the Hessian can be easily controlled.
RIS
TY  - CPAPER
TI  - Fast Mixing for Discrete Point Processes
AU  - Patrick Rebeschini
AU  - Amin Karbasi
BT  - Proceedings of The 28th Conference on Learning Theory
DA  - 2015/06/26
ED  - Peter Grünwald
ED  - Elad Hazan
ED  - Satyen Kale
ID  - pmlr-v40-Rebeschini15
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 40
SP  - 1480
EP  - 1500
L1  - http://proceedings.mlr.press/v40/Rebeschini15.pdf
UR  - https://proceedings.mlr.press/v40/Rebeschini15.html
AB  - We investigate a systematic mechanism for designing fast mixing Markov chain Monte Carlo algorithms to sample from discrete point processes under the Dobrushin uniqueness condition for Gibbs measures. Discrete point processes are defined as probability distributions μ(S) ∝ exp(βf(S)) over all subsets S ∈ 2^V of a finite set V, through a bounded set function f: 2^V → ℝ and a parameter β > 0. A subclass of discrete point processes characterized by submodular functions (which includes log-submodular distributions, submodular point processes, and determinantal point processes) has recently gained a lot of interest in machine learning and has been shown to be effective for modeling diversity and coverage. We show that if the set function (not necessarily submodular) displays a natural notion of decay of correlation, then, for β small enough, it is possible to design fast mixing Markov chain Monte Carlo methods that yield error bounds on marginal approximations that do not depend on the size of the set V. The sufficient conditions that we derive involve a control on the (discrete) Hessian of set functions, a quantity that has not been previously considered in the literature. We specialize our results to submodular functions, and we discuss canonical examples where the Hessian can be easily controlled.
ER  -
APA
Rebeschini, P. & Karbasi, A. (2015). Fast Mixing for Discrete Point Processes. Proceedings of The 28th Conference on Learning Theory, in Proceedings of Machine Learning Research 40:1480-1500. Available from https://proceedings.mlr.press/v40/Rebeschini15.html.
