Non-separable Non-stationary random fields

Kangrui Wang, Oliver Hamelijnck, Theodoros Damoulas, Mark Steel
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:9887-9897, 2020.

Abstract

We describe a framework for constructing nonstationary nonseparable random fields based on an infinite mixture of convolved stochastic processes. When the mixing process is stationary but the convolution function is nonstationary we arrive at nonseparable kernels with constant non-separability that are available in closed form. When the mixing is nonstationary and the convolution function is stationary we arrive at nonseparable random fields that have varying nonseparability and better preserve local structure. These fields have natural interpretations through the spectral representation of stochastic differential equations (SDEs) and are demonstrated on a range of synthetic benchmarks and spatio-temporal applications in geostatistics and machine learning. We show how a single Gaussian process (GP) with these random fields can computationally and statistically outperform both separable and existing nonstationary nonseparable approaches such as treed GPs and deep GP constructions.

Cite this Paper
BibTeX
@InProceedings{pmlr-v119-wang20g,
  title     = {Non-separable Non-stationary random fields},
  author    = {Wang, Kangrui and Hamelijnck, Oliver and Damoulas, Theodoros and Steel, Mark},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {9887--9897},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/wang20g/wang20g.pdf},
  url       = {https://proceedings.mlr.press/v119/wang20g.html},
  abstract  = {We describe a framework for constructing nonstationary nonseparable random fields based on an infinite mixture of convolved stochastic processes. When the mixing process is stationary but the convolution function is nonstationary we arrive at nonseparable kernels with constant non-separability that are available in closed form. When the mixing is nonstationary and the convolution function is stationary we arrive at nonseparable random fields that have varying nonseparability and better preserve local structure. These fields have natural interpretations through the spectral representation of stochastic differential equations (SDEs) and are demonstrated on a range of synthetic benchmarks and spatio-temporal applications in geostatistics and machine learning. We show how a single Gaussian process (GP) with these random fields can computationally and statistically outperform both separable and existing nonstationary nonseparable approaches such as treed GPs and deep GP constructions.}
}
Endnote
%0 Conference Paper
%T Non-separable Non-stationary random fields
%A Kangrui Wang
%A Oliver Hamelijnck
%A Theodoros Damoulas
%A Mark Steel
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-wang20g
%I PMLR
%P 9887--9897
%U https://proceedings.mlr.press/v119/wang20g.html
%V 119
%X We describe a framework for constructing nonstationary nonseparable random fields based on an infinite mixture of convolved stochastic processes. When the mixing process is stationary but the convolution function is nonstationary we arrive at nonseparable kernels with constant non-separability that are available in closed form. When the mixing is nonstationary and the convolution function is stationary we arrive at nonseparable random fields that have varying nonseparability and better preserve local structure. These fields have natural interpretations through the spectral representation of stochastic differential equations (SDEs) and are demonstrated on a range of synthetic benchmarks and spatio-temporal applications in geostatistics and machine learning. We show how a single Gaussian process (GP) with these random fields can computationally and statistically outperform both separable and existing nonstationary nonseparable approaches such as treed GPs and deep GP constructions.
APA
Wang, K., Hamelijnck, O., Damoulas, T. & Steel, M. (2020). Non-separable Non-stationary random fields. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:9887-9897. Available from https://proceedings.mlr.press/v119/wang20g.html.