Submodular Point Processes with Applications to Machine learning

Rishabh Iyer, Jeffrey Bilmes
Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics, PMLR 38:388-397, 2015.

Abstract

We introduce a class of discrete point processes that we call Submodular Point Processes (SPPs). These processes are characterized via a submodular (or supermodular) function, and naturally model notions of information, coverage, and diversity, as well as cooperation. Unlike log-submodular and log-supermodular distributions (Log-SPPs) such as determinantal point processes (DPPs), SPPs are themselves submodular (or supermodular). In this paper, we analyze the computational complexity of probabilistic inference in SPPs. We show that computing the partition function for SPPs (and Log-SPPs) requires exponential complexity in the worst case, and we also provide algorithms which approximate SPPs up to polynomial factors. Moreover, for several subclasses of interesting submodular functions that occur in applications, we show how to provide efficient closed-form expressions for the partition functions, and thereby marginals and conditional distributions. We also show that SPPs are closed under mixtures, thus enabling maximum-likelihood-based strategies for learning mixtures of submodular functions. Finally, we argue that SPPs complement existing Log-SPP distributions and are a natural model for several applications.
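As an illustration (not taken from the paper), the abstract's central object can be sketched concretely: an SPP assigns probability proportional to a submodular function f, so P(S) = f(S)/Z with partition function Z = Σ_{S⊆V} f(S). The brute-force sketch below, with a hypothetical helper `spp_partition_function` and a toy coverage function, shows why computing Z naively takes exponential time — it enumerates all 2^n subsets.

```python
from itertools import combinations

def coverage(chosen, universe_sets):
    """Submodular coverage function: number of elements covered by the chosen sets."""
    covered = set()
    for i in chosen:
        covered |= universe_sets[i]
    return len(covered)

def spp_partition_function(f, ground_set):
    """Brute-force partition function Z = sum of f(S) over all subsets S of the ground set.
    Exponential in len(ground_set) -- this is the worst-case hardness the paper discusses."""
    Z = 0.0
    n = len(ground_set)
    for r in range(n + 1):
        for S in combinations(ground_set, r):
            Z += f(S)
    return Z

# Toy ground set: indices of three overlapping sets.
universe_sets = {0: {1, 2}, 1: {2, 3}, 2: {3, 4}}
f = lambda S: coverage(S, universe_sets)
Z = spp_partition_function(f, list(universe_sets))
# The SPP probability of a subset S is then f(S) / Z.
```

For the three overlapping sets above, Z sums f over all eight subsets; the paper's point is that real instances avoid this enumeration either via approximation algorithms or via closed-form expressions for special subclasses.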

Cite this Paper


BibTeX
@InProceedings{pmlr-v38-iyer15,
  title     = {{Submodular Point Processes with Applications to Machine learning}},
  author    = {Iyer, Rishabh and Bilmes, Jeffrey},
  booktitle = {Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics},
  pages     = {388--397},
  year      = {2015},
  editor    = {Lebanon, Guy and Vishwanathan, S. V. N.},
  volume    = {38},
  series    = {Proceedings of Machine Learning Research},
  address   = {San Diego, California, USA},
  month     = {09--12 May},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v38/iyer15.pdf},
  url       = {https://proceedings.mlr.press/v38/iyer15.html},
  abstract  = {We introduce a class of discrete point processes that we call \emph{Submodular Point Processes} (SPPs). These processes are characterized via a submodular (or supermodular) function, and naturally model notions of \emph{information}, coverage and \emph{diversity}, as well as \emph{cooperation}. Unlike log-submodular and log-supermodular distributions (Log-SPPs) such as determinantal point processes (DPPs), SPPs are themselves submodular (or supermodular). In this paper, we analyze the computational complexity of probabilistic inference in SPPs. We show that computing the partition function for SPPs (and Log-SPPs) requires exponential complexity in the worst case, and also provide algorithms which approximate SPPs up to polynomial factors. Moreover, for several subclasses of interesting submodular functions that occur in applications, we show how we can provide efficient closed-form expressions for the partition functions, and thereby marginals and conditional distributions. We also show how SPPs are closed under mixtures, thus enabling maximum-likelihood-based strategies for learning mixtures of submodular functions. Finally, we argue how SPPs complement existing Log-SPP distributions, and are a natural model for several applications.}
}
Endnote
%0 Conference Paper
%T Submodular Point Processes with Applications to Machine learning
%A Rishabh Iyer
%A Jeffrey Bilmes
%B Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2015
%E Guy Lebanon
%E S. V. N. Vishwanathan
%F pmlr-v38-iyer15
%I PMLR
%P 388--397
%U https://proceedings.mlr.press/v38/iyer15.html
%V 38
%X We introduce a class of discrete point processes that we call Submodular Point Processes (SPPs). These processes are characterized via a submodular (or supermodular) function, and naturally model notions of information, coverage and diversity, as well as cooperation. Unlike log-submodular and log-supermodular distributions (Log-SPPs) such as determinantal point processes (DPPs), SPPs are themselves submodular (or supermodular). In this paper, we analyze the computational complexity of probabilistic inference in SPPs. We show that computing the partition function for SPPs (and Log-SPPs) requires exponential complexity in the worst case, and also provide algorithms which approximate SPPs up to polynomial factors. Moreover, for several subclasses of interesting submodular functions that occur in applications, we show how we can provide efficient closed-form expressions for the partition functions, and thereby marginals and conditional distributions. We also show how SPPs are closed under mixtures, thus enabling maximum-likelihood-based strategies for learning mixtures of submodular functions. Finally, we argue how SPPs complement existing Log-SPP distributions, and are a natural model for several applications.
RIS
TY - CPAPER
TI - Submodular Point Processes with Applications to Machine learning
AU - Rishabh Iyer
AU - Jeffrey Bilmes
BT - Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics
DA - 2015/02/21
ED - Guy Lebanon
ED - S. V. N. Vishwanathan
ID - pmlr-v38-iyer15
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 38
SP - 388
EP - 397
L1 - http://proceedings.mlr.press/v38/iyer15.pdf
UR - https://proceedings.mlr.press/v38/iyer15.html
AB - We introduce a class of discrete point processes that we call Submodular Point Processes (SPPs). These processes are characterized via a submodular (or supermodular) function, and naturally model notions of information, coverage and diversity, as well as cooperation. Unlike log-submodular and log-supermodular distributions (Log-SPPs) such as determinantal point processes (DPPs), SPPs are themselves submodular (or supermodular). In this paper, we analyze the computational complexity of probabilistic inference in SPPs. We show that computing the partition function for SPPs (and Log-SPPs) requires exponential complexity in the worst case, and also provide algorithms which approximate SPPs up to polynomial factors. Moreover, for several subclasses of interesting submodular functions that occur in applications, we show how we can provide efficient closed-form expressions for the partition functions, and thereby marginals and conditional distributions. We also show how SPPs are closed under mixtures, thus enabling maximum-likelihood-based strategies for learning mixtures of submodular functions. Finally, we argue how SPPs complement existing Log-SPP distributions, and are a natural model for several applications.
ER -
APA
Iyer, R. & Bilmes, J. (2015). Submodular Point Processes with Applications to Machine learning. Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 38:388-397. Available from https://proceedings.mlr.press/v38/iyer15.html.
