Nyström Approximation for Large-Scale Determinantal Processes
Proceedings of the Sixteenth International Conference on Artificial Intelligence and Statistics, PMLR 31:85-98, 2013.
Abstract
Determinantal point processes (DPPs) are appealing models for subset selection problems where diversity is desired. They offer surprisingly efficient inference, including sampling in $O(N^3)$ time and $O(N^2)$ space, where $N$ is the number of base items. However, in some applications, $N$ may grow so large that sampling from a DPP becomes computationally infeasible. This is especially true in settings where the DPP kernel matrix cannot be represented by a linear decomposition of low-dimensional feature vectors. In these cases, we propose applying the Nyström approximation to project the kernel matrix into a low-dimensional space. While theoretical guarantees for the Nyström approximation in terms of standard matrix norms have been previously established, we are concerned with probabilistic measures, such as the total variation distance between the DPP generated by a kernel matrix and the one generated by its Nyström approximation, which behave quite differently. In this paper we derive new error bounds for the Nyström-approximated DPP and present empirical results to corroborate them. We then demonstrate the Nyström-approximated DPP by applying it to a motion capture summarization task.
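As a rough illustration of the kernel approximation the abstract refers to, the sketch below shows a standard Nyström approximation of a PSD kernel matrix from a uniformly sampled set of landmark columns, $\hat{L} = C W^{+} C^{\top}$. This is a generic, hedged example (the function name, landmark-selection scheme, and toy RBF kernel are illustrative assumptions, not the paper's exact construction or error analysis).

```python
import numpy as np

def nystrom_approximation(L, m, rng=None):
    """Rank-<=m Nystrom approximation of an N x N PSD kernel matrix L.

    Uses m uniformly sampled landmark columns (an assumed, generic scheme):
        L_hat = C W^+ C^T,  with C = L[:, idx] and W = L[np.ix_(idx, idx)].
    """
    rng = np.random.default_rng(rng)
    N = L.shape[0]
    idx = rng.choice(N, size=m, replace=False)   # landmark indices
    C = L[:, idx]                                # N x m block of columns
    W = C[idx, :]                                # m x m landmark submatrix
    return C @ np.linalg.pinv(W) @ C.T           # N x N approximation

# Toy usage: RBF kernel on random points, relative Frobenius error.
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 5))
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
L = np.exp(-0.5 * sq_dists)
L_hat = nystrom_approximation(L, m=50, rng=0)
print("relative Frobenius error:", np.linalg.norm(L - L_hat) / np.linalg.norm(L))
```

Because the approximation has rank at most $m$, downstream DPP computations can work with the low-dimensional factors instead of the full $N \times N$ kernel; the paper's contribution is bounding how this approximation error propagates to the induced DPP distribution itself.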