Doubly Stochastic Variational Inference for Neural Processes with Hierarchical Latent Variables

Qi Wang, Herke Van Hoof
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:10018-10028, 2020.

Abstract

Neural processes (NPs) constitute a family of approximate variational models for stochastic processes with promising properties in computational efficiency and uncertainty quantification. These processes use neural networks with latent variable inputs to induce a predictive distribution. However, the expressiveness of vanilla NPs is limited, as they use only a global latent variable, while target-specific local variation can sometimes be crucial. To address this challenge, we investigate NPs systematically and present a new variant, the Doubly Stochastic Variational Neural Process (DSVNP), which combines a global latent variable with local latent variables for prediction. We evaluate this model in several experiments, and our results demonstrate competitive predictive performance in multi-output regression and uncertainty estimation in classification.
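To make the hierarchy described above concrete, the following is a minimal PyTorch sketch of the idea as stated in the abstract: a global latent variable summarising the context set, plus a per-target local latent variable conditioned on it, both sampled before decoding (hence "doubly stochastic"). All class names, module shapes, and dimensions (DSVNPSketch, r_dim, z_dim, u_dim, the mean-pooled encoder) are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn

class DSVNPSketch(nn.Module):
    # Hypothetical sketch: a global latent z summarises the whole context
    # set, and a local latent u_i adds target-specific variation; both are
    # sampled before decoding.
    def __init__(self, x_dim=1, y_dim=1, r_dim=64, z_dim=32, u_dim=32):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(x_dim + y_dim, r_dim), nn.ReLU(),
                                 nn.Linear(r_dim, r_dim))
        self.global_head = nn.Linear(r_dim, 2 * z_dim)  # mean / log-var of z
        self.local_head = nn.Sequential(nn.Linear(x_dim + z_dim, r_dim), nn.ReLU(),
                                        nn.Linear(r_dim, 2 * u_dim))  # mean / log-var of u_i
        self.dec = nn.Sequential(nn.Linear(x_dim + z_dim + u_dim, r_dim), nn.ReLU(),
                                 nn.Linear(r_dim, 2 * y_dim))  # predictive mean / log-var

    @staticmethod
    def reparam(stats):
        # Reparameterisation trick: sample from N(mu, exp(logvar)).
        mu, logvar = stats.chunk(2, dim=-1)
        return mu + torch.randn_like(mu) * (0.5 * logvar).exp()

    def forward(self, x_ctx, y_ctx, x_tgt):
        # Permutation-invariant summary of the context set -> global latent z.
        r = self.enc(torch.cat([x_ctx, y_ctx], dim=-1)).mean(dim=1)  # (B, r_dim)
        z = self.reparam(self.global_head(r))                        # (B, z_dim)
        # Second stochastic layer: one local latent u_i per target input,
        # conditioned on the global sample z.
        z_rep = z.unsqueeze(1).expand(-1, x_tgt.size(1), -1)         # (B, T, z_dim)
        u = self.reparam(self.local_head(torch.cat([x_tgt, z_rep], dim=-1)))
        # Decode a heteroscedastic Gaussian prediction per target point.
        out = self.dec(torch.cat([x_tgt, z_rep, u], dim=-1))
        return out.chunk(2, dim=-1)                                  # mu_y, logvar_y

model = DSVNPSketch()
x_c, y_c, x_t = torch.randn(8, 10, 1), torch.randn(8, 10, 1), torch.randn(8, 15, 1)
mu_y, logvar_y = model(x_c, y_c, x_t)  # each of shape (8, 15, 1)

Under doubly stochastic variational inference, training such a model would pair the Gaussian likelihood of the targets with KL regularisers for both the global latent z and each local latent u_i; the exact objective and network choices in the paper may differ from this sketch.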

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-wang20s,
  title     = {Doubly Stochastic Variational Inference for Neural Processes with Hierarchical Latent Variables},
  author    = {Wang, Qi and Van Hoof, Herke},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {10018--10028},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/wang20s/wang20s.pdf},
  url       = {https://proceedings.mlr.press/v119/wang20s.html},
  abstract  = {Neural processes (NPs) constitute a family of variational approximate models for stochastic processes with promising properties in computational efficiency and uncertainty quantification. These processes use neural networks with latent variable inputs to induce a predictive distribution. However, the expressiveness of vanilla NPs is limited as they only use a global latent variable, while target-specific local variation may be crucial sometimes. To address this challenge, we investigate NPs systematically and present a new variant of NP model that we call Doubly Stochastic Variational Neural Process (DSVNP). This model combines the global latent variable and local latent variables for prediction. We evaluate this model in several experiments, and our results demonstrate competitive prediction performance in multi-output regression and uncertainty estimation in classification.}
}
Endnote
%0 Conference Paper
%T Doubly Stochastic Variational Inference for Neural Processes with Hierarchical Latent Variables
%A Qi Wang
%A Herke Van Hoof
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-wang20s
%I PMLR
%P 10018--10028
%U https://proceedings.mlr.press/v119/wang20s.html
%V 119
%X Neural processes (NPs) constitute a family of variational approximate models for stochastic processes with promising properties in computational efficiency and uncertainty quantification. These processes use neural networks with latent variable inputs to induce a predictive distribution. However, the expressiveness of vanilla NPs is limited as they only use a global latent variable, while target-specific local variation may be crucial sometimes. To address this challenge, we investigate NPs systematically and present a new variant of NP model that we call Doubly Stochastic Variational Neural Process (DSVNP). This model combines the global latent variable and local latent variables for prediction. We evaluate this model in several experiments, and our results demonstrate competitive prediction performance in multi-output regression and uncertainty estimation in classification.
APA
Wang, Q. & Van Hoof, H. (2020). Doubly Stochastic Variational Inference for Neural Processes with Hierarchical Latent Variables. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:10018-10028. Available from https://proceedings.mlr.press/v119/wang20s.html.
