# Submodular Point Processes with Applications to Machine Learning

Rishabh Iyer, Jeffrey Bilmes

Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics, PMLR 38:388-397, 2015.

#### Abstract

We introduce a class of discrete point processes that we call *Submodular Point Processes* (SPPs). These processes are characterized via a submodular (or supermodular) function, and naturally model notions of *information*, *coverage*, and *diversity*, as well as *cooperation*. Unlike log-submodular and log-supermodular distributions (Log-SPPs) such as determinantal point processes (DPPs), SPPs are themselves submodular (or supermodular). In this paper, we analyze the computational complexity of probabilistic inference in SPPs. We show that computing the partition function for SPPs (and Log-SPPs) requires exponential complexity in the worst case, and also provide algorithms which approximate SPPs up to polynomial factors. Moreover, for several subclasses of interesting submodular functions that occur in applications, we show how we can provide efficient closed-form expressions for the partition functions, and thereby marginals and conditional distributions. We also show how SPPs are closed under mixtures, thus enabling maximum-likelihood-based strategies for learning mixtures of submodular functions. Finally, we argue how SPPs complement existing Log-SPP distributions, and are a natural model for several applications.
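To make the model concrete, here is a minimal sketch (not from the paper) of an SPP over a small ground set: the probability of a subset S is taken proportional to a submodular coverage function f(S), with the partition function Z computed by brute-force enumeration. The `coverage` function and the toy sensor data are illustrative assumptions, and this exhaustive approach is only feasible for tiny ground sets, consistent with the abstract's point that computing Z is exponential in the worst case.

```python
from itertools import chain, combinations

def coverage(S, sets):
    # Submodular coverage function: number of distinct elements
    # covered by the chosen collection of sets.
    covered = set()
    for i in S:
        covered |= sets[i]
    return len(covered)

def spp_distribution(sets):
    # Brute-force SPP: P(S) is proportional to f(S), normalized
    # by the partition function Z = sum over all subsets of f(S).
    n = len(sets)
    subsets = list(chain.from_iterable(
        combinations(range(n), r) for r in range(n + 1)))
    values = {S: coverage(S, sets) for S in subsets}
    Z = sum(values.values())
    return {S: v / Z for S, v in values.items()}, Z

# Toy ground set: three "sensors" with overlapping coverage regions.
sensors = [{1, 2}, {2, 3}, {4}]
probs, Z = spp_distribution(sensors)
```

Note the contrast with a Log-SPP such as a DPP, where P(S) would instead be proportional to exp(f(S)) (equivalently, f is the log-probability up to normalization); here the submodular value itself plays the role of the unnormalized mass.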

#### Related Material