Monotonic Gaussian Process Flows

Ivan Ustyuzhaninov, Ieva Kazlauskaite, Carl Henrik Ek, Neill Campbell
Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, PMLR 108:3057-3067, 2020.

Abstract

We propose a new framework for imposing monotonicity constraints in a Bayesian non-parametric setting based on numerical solutions of stochastic differential equations. We derive a nonparametric model of monotonic functions that allows for interpretable priors and principled quantification of hierarchical uncertainty. We demonstrate the efficacy of the proposed model by providing competitive results to other probabilistic monotonic models on a number of benchmark functions. In addition, we consider the utility of a monotonic random process as a part of a hierarchical probabilistic model; we examine the task of temporal alignment of time-series data where it is beneficial to use a monotonic random process in order to preserve the uncertainty in the temporal warpings.
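The monotonicity argument rests on a standard property of one-dimensional SDEs: two solutions driven by the same drift and the same Brownian path cannot cross, so the flow map from initial to final values preserves ordering. A minimal numerical sketch of this idea (not the authors' implementation — the drift here is a fixed smooth function standing in for a sample from a GP prior, and the integrator is plain Euler-Maruyama):

```python
import numpy as np

def sde_flow(x0, drift, T=1.0, n_steps=100, sigma=0.3, seed=0):
    """Euler-Maruyama integration of dx = drift(x) dt + sigma dW,
    applying a shared Brownian increment to all initial points so
    that trajectories cannot cross."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt))  # one increment, shared by all particles
        x = x + drift(x) * dt + sigma * dW
    return x

drift = lambda x: np.sin(x)          # placeholder for a GP drift sample
x0 = np.linspace(-2.0, 2.0, 50)      # ordered inputs
x1 = sde_flow(x0, drift)
print(np.all(np.diff(x1) > 0))       # ordering is preserved: True
```

For a Lipschitz drift and a sufficiently small step size, each Euler step is itself a monotone map (the shared noise term only shifts all particles by the same amount), which is why the discretised flow inherits the non-crossing property of the continuous SDE.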

Cite this Paper


BibTeX
@InProceedings{pmlr-v108-ustyuzhaninov20a,
  title     = {Monotonic Gaussian Process Flows},
  author    = {Ustyuzhaninov, Ivan and Kazlauskaite, Ieva and Ek, Carl Henrik and Campbell, Neill},
  booktitle = {Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics},
  pages     = {3057--3067},
  year      = {2020},
  editor    = {Silvia Chiappa and Roberto Calandra},
  volume    = {108},
  series    = {Proceedings of Machine Learning Research},
  month     = {26--28 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v108/ustyuzhaninov20a/ustyuzhaninov20a.pdf},
  url       = {http://proceedings.mlr.press/v108/ustyuzhaninov20a.html},
  abstract  = {We propose a new framework for imposing monotonicity constraints in a Bayesian non-parametric setting based on numerical solutions of stochastic differential equations. We derive a nonparametric model of monotonic functions that allows for interpretable priors and principled quantification of hierarchical uncertainty. We demonstrate the efficacy of the proposed model by providing competitive results to other probabilistic monotonic models on a number of benchmark functions. In addition, we consider the utility of a monotonic random process as a part of a hierarchical probabilistic model; we examine the task of temporal alignment of time-series data where it is beneficial to use a monotonic random process in order to preserve the uncertainty in the temporal warpings.}
}
Endnote
%0 Conference Paper
%T Monotonic Gaussian Process Flows
%A Ivan Ustyuzhaninov
%A Ieva Kazlauskaite
%A Carl Henrik Ek
%A Neill Campbell
%B Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2020
%E Silvia Chiappa
%E Roberto Calandra
%F pmlr-v108-ustyuzhaninov20a
%I PMLR
%P 3057--3067
%U http://proceedings.mlr.press/v108/ustyuzhaninov20a.html
%V 108
%X We propose a new framework for imposing monotonicity constraints in a Bayesian non-parametric setting based on numerical solutions of stochastic differential equations. We derive a nonparametric model of monotonic functions that allows for interpretable priors and principled quantification of hierarchical uncertainty. We demonstrate the efficacy of the proposed model by providing competitive results to other probabilistic monotonic models on a number of benchmark functions. In addition, we consider the utility of a monotonic random process as a part of a hierarchical probabilistic model; we examine the task of temporal alignment of time-series data where it is beneficial to use a monotonic random process in order to preserve the uncertainty in the temporal warpings.
APA
Ustyuzhaninov, I., Kazlauskaite, I., Ek, C.H. & Campbell, N. (2020). Monotonic Gaussian Process Flows. Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 108:3057-3067. Available from http://proceedings.mlr.press/v108/ustyuzhaninov20a.html.