Rates of estimation for determinantal point processes
Proceedings of the 2017 Conference on Learning Theory, PMLR 65:343-345, 2017.
Abstract
Determinantal point processes (DPPs) have wide-ranging applications in machine learning, where they are used to enforce the notion of diversity in subset selection problems. Many estimators have been proposed, but surprisingly the basic properties of the maximum likelihood estimator (MLE) have received little attention. In this paper, we study the local geometry of the expected log-likelihood function to prove several rates of convergence for the MLE. We also give a complete characterization of the case where the MLE converges at a parametric rate. Even in the latter case, we exhibit a potential curse of dimensionality where the asymptotic variance of the MLE is exponentially large in the dimension of the problem.
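The log-likelihood the abstract refers to can be written down concretely for the standard L-ensemble parameterization of a DPP, where a subset S of a ground set of size N is drawn with probability det(L_S) / det(L + I). The sketch below (a hypothetical helper, not code from the paper) evaluates this log-likelihood for a collection of observed subsets; the kernel L is assumed to be a symmetric positive semidefinite NumPy array.

```python
import numpy as np

def dpp_log_likelihood(L, samples):
    """Log-likelihood of observed subsets under an L-ensemble DPP.

    Each subset S has probability det(L_S) / det(L + I), where L_S is
    the principal submatrix of L indexed by S. `samples` is a list of
    index lists; `L` is a symmetric PSD (N, N) array.
    """
    N = L.shape[0]
    # Normalizing constant: sum over all subsets of det(L_S) equals det(L + I).
    _, log_norm = np.linalg.slogdet(L + np.eye(N))
    ll = 0.0
    for S in samples:
        idx = np.asarray(S, dtype=int)
        if idx.size == 0:
            sub_logdet = 0.0  # determinant of the empty matrix is 1
        else:
            _, sub_logdet = np.linalg.slogdet(L[np.ix_(idx, idx)])
        ll += sub_logdet - log_norm
    return ll
```

Maximizing this quantity over L is the MLE whose rates of convergence the paper analyzes; the log det(L + I) term couples all entries of L, which is one source of the non-concavity that makes the local geometry of the expected log-likelihood nontrivial.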