Large-Scale Cox Process Inference using Variational Fourier Features

ST John, James Hensman
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:2362-2370, 2018.

Abstract

Gaussian process modulated Poisson processes provide a flexible framework for modeling spatiotemporal point patterns. So far, such inference has been restricted to one dimension, to binning on a pre-determined grid, or to small data sets of up to a few thousand points. Here we introduce Cox process inference based on Fourier features. This sparse representation induces global rather than local constraints on the function space and is computationally efficient. This allows us to formulate a grid-free approximation that scales well with the number of data points and the size of the domain. We demonstrate that this allows MCMC approximations to the non-Gaussian posterior. In practice, we find that Fourier features have more consistent optimization behavior than previous approaches. Our approximate Bayesian method can fit over 100 000 events with complex spatiotemporal patterns in three dimensions on a single GPU.
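To make the model class concrete, here is a minimal sketch (not the authors' code, and using a naive dense-grid GP rather than their variational Fourier features) of a one-dimensional Cox process whose intensity is a squared Gaussian-process sample, λ(x) = f(x)², simulated by thinning. All names and kernel parameters below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw f ~ GP(0, k) on a fine grid with an RBF kernel (illustrative parameters).
xs = np.linspace(0.0, 10.0, 500)
lengthscale, variance = 1.0, 4.0
K = variance * np.exp(-0.5 * (xs[:, None] - xs[None, :]) ** 2 / lengthscale**2)
f = rng.multivariate_normal(np.zeros_like(xs), K + 1e-8 * np.eye(len(xs)))

# Squared link function guarantees a nonnegative intensity.
lam = f**2
lam_max = lam.max()

# Thinning: propose from a homogeneous Poisson process at rate lam_max,
# then keep each proposal with probability lambda(x) / lam_max.
n_prop = rng.poisson(lam_max * (xs[-1] - xs[0]))
proposals = rng.uniform(xs[0], xs[-1], size=n_prop)
lam_at = np.interp(proposals, xs, lam)
events = np.sort(proposals[rng.uniform(0.0, lam_max, size=n_prop) < lam_at])
```

The O(n³) Cholesky cost of the dense grid here is exactly what the paper's Fourier-feature representation avoids, which is what makes grid-free inference over large domains tractable.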

Cite this Paper


BibTeX
@InProceedings{pmlr-v80-john18a,
  title     = {Large-Scale {C}ox Process Inference using Variational {F}ourier Features},
  author    = {John, ST and Hensman, James},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages     = {2362--2370},
  year      = {2018},
  editor    = {Dy, Jennifer and Krause, Andreas},
  volume    = {80},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--15 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v80/john18a/john18a.pdf},
  url       = {https://proceedings.mlr.press/v80/john18a.html},
  abstract  = {Gaussian process modulated Poisson processes provide a flexible framework for modeling spatiotemporal point patterns. So far this had been restricted to one dimension, binning to a pre-determined grid, or small data sets of up to a few thousand data points. Here we introduce Cox process inference based on Fourier features. This sparse representation induces global rather than local constraints on the function space and is computationally efficient. This allows us to formulate a grid-free approximation that scales well with the number of data points and the size of the domain. We demonstrate that this allows MCMC approximations to the non-Gaussian posterior. In practice, we find that Fourier features have more consistent optimization behavior than previous approaches. Our approximate Bayesian method can fit over 100 000 events with complex spatiotemporal patterns in three dimensions on a single GPU.}
}
Endnote
%0 Conference Paper
%T Large-Scale Cox Process Inference using Variational Fourier Features
%A ST John
%A James Hensman
%B Proceedings of the 35th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2018
%E Jennifer Dy
%E Andreas Krause
%F pmlr-v80-john18a
%I PMLR
%P 2362--2370
%U https://proceedings.mlr.press/v80/john18a.html
%V 80
%X Gaussian process modulated Poisson processes provide a flexible framework for modeling spatiotemporal point patterns. So far this had been restricted to one dimension, binning to a pre-determined grid, or small data sets of up to a few thousand data points. Here we introduce Cox process inference based on Fourier features. This sparse representation induces global rather than local constraints on the function space and is computationally efficient. This allows us to formulate a grid-free approximation that scales well with the number of data points and the size of the domain. We demonstrate that this allows MCMC approximations to the non-Gaussian posterior. In practice, we find that Fourier features have more consistent optimization behavior than previous approaches. Our approximate Bayesian method can fit over 100 000 events with complex spatiotemporal patterns in three dimensions on a single GPU.
APA
John, S. & Hensman, J. (2018). Large-Scale Cox Process Inference using Variational Fourier Features. Proceedings of the 35th International Conference on Machine Learning, in Proceedings of Machine Learning Research 80:2362-2370. Available from https://proceedings.mlr.press/v80/john18a.html.
