GP-ConvCNP: Better generalization for conditional convolutional Neural Processes on time series data

Jens Petersen, Gregor Köhler, David Zimmerer, Fabian Isensee, Paul F. Jäger, Klaus H. Maier-Hein
Proceedings of the Thirty-Seventh Conference on Uncertainty in Artificial Intelligence, PMLR 161:939-949, 2021.

Abstract

Neural Processes (NPs) are a family of conditional generative models that are able to model a distribution over functions, in a way that allows them to perform predictions at test time conditioned on a number of context points. A recent addition to this family, Convolutional Conditional Neural Processes (ConvCNP), has shown remarkable improvement in performance over prior art, but we find that they sometimes struggle to generalize when applied to time series data. In particular, they are not robust to distribution shifts and fail to extrapolate observed patterns into the future. By incorporating a Gaussian Process into the model, we are able to remedy this and at the same time improve performance within distribution. As an added benefit, the Gaussian Process reintroduces the possibility to sample from the model, a key feature of other members in the NP family.
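The core mechanism the abstract refers to, conditioning on context points and then sampling functions, is exactly what a Gaussian Process posterior provides. The sketch below is not the paper's GP-ConvCNP model; it is a minimal, self-contained illustration (using an assumed RBF kernel and toy data) of how a GP conditioned on a few context observations yields both predictions and function samples at target locations:

```python
import numpy as np

def rbf_kernel(x1, x2, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel between two sets of 1-D inputs."""
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_posterior(x_ctx, y_ctx, x_tgt, noise=1e-4):
    """Posterior mean and covariance of a zero-mean GP,
    conditioned on the observed context points."""
    K_cc = rbf_kernel(x_ctx, x_ctx) + noise * np.eye(len(x_ctx))
    K_ct = rbf_kernel(x_ctx, x_tgt)
    K_tt = rbf_kernel(x_tgt, x_tgt)
    mean = K_ct.T @ np.linalg.solve(K_cc, y_ctx)
    cov = K_tt - K_ct.T @ np.linalg.solve(K_cc, K_ct)
    return mean, cov

# Toy example: three context points from a sine wave.
rng = np.random.default_rng(0)
x_ctx = np.array([-1.0, 0.0, 1.5])
y_ctx = np.sin(x_ctx)
x_tgt = np.linspace(-2.0, 2.0, 50)

mean, cov = gp_posterior(x_ctx, y_ctx, x_tgt)
# Sampling from the posterior gives coherent function draws,
# the capability the GP reintroduces into the conditional model.
samples = rng.multivariate_normal(
    mean, cov + 1e-8 * np.eye(len(x_tgt)), size=3
)
```

In GP-ConvCNP the GP plays this role inside a learned model rather than on raw data, but the interface is the same: context in, a distribution over functions out.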

Cite this Paper


BibTeX
@InProceedings{pmlr-v161-petersen21a,
  title     = {GP-ConvCNP: Better generalization for conditional convolutional Neural Processes on time series data},
  author    = {Petersen, Jens and K{\"o}hler, Gregor and Zimmerer, David and Isensee, Fabian and J{\"a}ger, Paul F. and Maier-Hein, Klaus H.},
  booktitle = {Proceedings of the Thirty-Seventh Conference on Uncertainty in Artificial Intelligence},
  pages     = {939--949},
  year      = {2021},
  editor    = {de Campos, Cassio and Maathuis, Marloes H.},
  volume    = {161},
  series    = {Proceedings of Machine Learning Research},
  month     = {27--30 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v161/petersen21a/petersen21a.pdf},
  url       = {https://proceedings.mlr.press/v161/petersen21a.html},
  abstract  = {Neural Processes (NPs) are a family of conditional generative models that are able to model a distribution over functions, in a way that allows them to perform predictions at test time conditioned on a number of context points. A recent addition to this family, Convolutional Conditional Neural Processes (ConvCNP), have shown remarkable improvement in performance over prior art, but we find that they sometimes struggle to generalize when applied to time series data. In particular, they are not robust to distribution shifts and fail to extrapolate observed patterns into the future. By incorporating a Gaussian Process into the model, we are able to remedy this and at the same time improve performance within distribution. As an added benefit, the Gaussian Process reintroduces the possibility to sample from the model, a key feature of other members in the NP family.}
}
Endnote
%0 Conference Paper
%T GP-ConvCNP: Better generalization for conditional convolutional Neural Processes on time series data
%A Jens Petersen
%A Gregor Köhler
%A David Zimmerer
%A Fabian Isensee
%A Paul F. Jäger
%A Klaus H. Maier-Hein
%B Proceedings of the Thirty-Seventh Conference on Uncertainty in Artificial Intelligence
%C Proceedings of Machine Learning Research
%D 2021
%E Cassio de Campos
%E Marloes H. Maathuis
%F pmlr-v161-petersen21a
%I PMLR
%P 939--949
%U https://proceedings.mlr.press/v161/petersen21a.html
%V 161
%X Neural Processes (NPs) are a family of conditional generative models that are able to model a distribution over functions, in a way that allows them to perform predictions at test time conditioned on a number of context points. A recent addition to this family, Convolutional Conditional Neural Processes (ConvCNP), have shown remarkable improvement in performance over prior art, but we find that they sometimes struggle to generalize when applied to time series data. In particular, they are not robust to distribution shifts and fail to extrapolate observed patterns into the future. By incorporating a Gaussian Process into the model, we are able to remedy this and at the same time improve performance within distribution. As an added benefit, the Gaussian Process reintroduces the possibility to sample from the model, a key feature of other members in the NP family.
APA
Petersen, J., Köhler, G., Zimmerer, D., Isensee, F., Jäger, P.F. & Maier-Hein, K.H. (2021). GP-ConvCNP: Better generalization for conditional convolutional Neural Processes on time series data. Proceedings of the Thirty-Seventh Conference on Uncertainty in Artificial Intelligence, in Proceedings of Machine Learning Research 161:939-949. Available from https://proceedings.mlr.press/v161/petersen21a.html.