Amortized Variational Inference with Graph Convolutional Networks for Gaussian Processes

Linfeng Liu, Liping Liu
Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, PMLR 89:2291-2300, 2019.

Abstract

GP inference on large datasets is computationally expensive, especially when the observation likelihood is non-Gaussian. To reduce the computation, many recent variational inference methods define the variational distribution over a small number of inducing points, but these methods face a hard tradeoff between the flexibility of the distribution and computational efficiency. In this paper, we focus on approximating the GP posterior at a local level: we define a reusable template that approximates the posterior over neighborhoods while maintaining a global approximation. We first construct a variational distribution in which the inference for a data point depends only on its neighborhood, thereby decoupling the calculation for each data point. We then train Graph Convolutional Networks as a reusable model that runs inference for every data point. Compared to previous methods, ours greatly reduces both the number of parameters and the number of optimization iterations. In empirical evaluations, the proposed method significantly speeds up inference and often obtains more accurate results than competing methods.
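As a rough illustration of the amortization step described in the abstract (a graph convolutional network that maps each data point's neighborhood to local variational parameters), here is a minimal NumPy sketch. The kNN graph construction, the single tanh layer, and all names and dimensions are illustrative assumptions, not the authors' implementation; the variational objective used to train the weights (an ELBO combining the GP prior with the non-Gaussian likelihood) is omitted.

import numpy as np

def knn_adjacency(X, k=5):
    # Symmetrically normalized kNN adjacency with self-loops (assumption:
    # neighborhoods are Euclidean k-nearest neighbors).
    n = X.shape[0]
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    A = np.zeros((n, n))
    for i in range(n):
        A[i, np.argsort(d2[i])[1:k + 1]] = 1.0   # skip the point itself
    A = np.maximum(A, A.T) + np.eye(n)           # symmetrize, add self-loops
    deg = A.sum(1)
    return A / np.sqrt(deg[:, None] * deg[None, :])   # D^{-1/2} (A+I) D^{-1/2}

def gcn_layer(A_hat, H, W):
    # One graph convolution: aggregate each neighborhood, then transform.
    return np.tanh(A_hat @ H @ W)

def amortized_posterior(X, y, W1, W_mu, W_logvar):
    # The amortization step: one shared (reusable) network maps each point's
    # neighborhood to its local variational parameters q(f_i) = N(mu_i, sigma_i^2),
    # so no per-point variational parameters need to be optimized.
    A_hat = knn_adjacency(X)
    H = gcn_layer(A_hat, np.column_stack([X, y]), W1)
    return (H @ W_mu).ravel(), (H @ W_logvar).ravel()   # means, log-variances

# Toy usage: random weights stand in for weights that, in the paper's setting,
# would be trained by maximizing a variational lower bound (ELBO).
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))                        # 50 inputs in R^2
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=50)     # toy observations
W1 = rng.normal(size=(3, 16)) / 4                   # input features are [X, y]
W_mu = rng.normal(size=(16, 1)) / 4
W_logvar = rng.normal(size=(16, 1)) / 4
mu, logvar = amortized_posterior(X, y, W1, W_mu, W_logvar)
print(mu.shape, logvar.shape)                       # (50,) (50,)

Because the network weights are shared across all data points, inference for each point reduces to one forward pass over its neighborhood rather than a fresh per-point optimization, which is the source of the reduction in parameters and iterations claimed above.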

Cite this Paper


BibTeX
@InProceedings{pmlr-v89-liu19c,
  title     = {Amortized Variational Inference with Graph Convolutional Networks for Gaussian Processes},
  author    = {Liu, Linfeng and Liu, Liping},
  booktitle = {Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics},
  pages     = {2291--2300},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Sugiyama, Masashi},
  volume    = {89},
  series    = {Proceedings of Machine Learning Research},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v89/liu19c/liu19c.pdf},
  url       = {https://proceedings.mlr.press/v89/liu19c.html}
}
APA
Liu, L. & Liu, L. (2019). Amortized Variational Inference with Graph Convolutional Networks for Gaussian Processes. Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 89:2291-2300. Available from https://proceedings.mlr.press/v89/liu19c.html.
