Faster Greedy MAP Inference for Determinantal Point Processes
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:1384-1393, 2017.
Abstract
Determinantal point processes (DPPs) are popular probabilistic models that arise in many machine learning tasks, where distributions of diverse sets are characterized by determinants of their features. In this paper, we develop fast algorithms to find the most likely configuration (MAP) of large-scale DPPs, which is NP-hard in general. Due to the submodular nature of the MAP objective, greedy algorithms have been used with empirical success. Greedy implementations require computation of log-determinants, matrix inverses or solving linear systems at each iteration. We present faster implementations of the greedy algorithms by utilizing the orthogonal benefits of two log-determinant approximation schemes: (a) first-order expansions to the matrix log-determinant function and (b) high-order expansions to the scalar log function with stochastic trace estimators. In our experiments, our algorithms are orders of magnitude faster than their competitors, while sacrificing marginal accuracy.
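To make the baseline concrete, the following is a minimal sketch of the standard greedy MAP procedure the abstract refers to: at each step, add the item that most increases log det of the kernel submatrix. This naive version recomputes a log-determinant per candidate (the cost the paper's approximation schemes reduce); the function name and the toy kernel are illustrative, not from the paper.

```python
import numpy as np

def greedy_dpp_map(L, k):
    """Greedy MAP for a DPP with PSD kernel L: repeatedly add the item
    maximizing log det(L_S). Baseline only; each step recomputes a
    log-determinant directly via slogdet."""
    n = L.shape[0]
    selected = []
    for _ in range(k):
        best_item, best_val = None, -np.inf
        for i in range(n):
            if i in selected:
                continue
            S = selected + [i]
            sign, logdet = np.linalg.slogdet(L[np.ix_(S, S)])
            if sign > 0 and logdet > best_val:
                best_item, best_val = i, logdet
        if best_item is None:  # no item yields a positive-determinant set
            break
        selected.append(best_item)
    return selected

# Toy example: a small PSD kernel built from random features
rng = np.random.default_rng(0)
X = rng.normal(size=(6, 3))
L = X @ X.T + 0.1 * np.eye(6)
print(greedy_dpp_map(L, 3))
```

The paper's contribution is to replace the exact log-determinant evaluations inside this loop with cheaper first-order or stochastic trace-estimator approximations.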