Sparse Variational Inference for Generalized GP Models

Rishit Sheth, Yuyang Wang, Roni Khardon
Proceedings of the 32nd International Conference on Machine Learning, PMLR 37:1302-1311, 2015.

Abstract

Gaussian processes (GPs) provide an attractive machine learning model due to their non-parametric form, their flexibility to capture many types of observation data, and their generic inference procedures. Sparse GP inference algorithms address the cubic complexity of GPs by focusing on a small set of pseudo-samples. To date, such approaches have focused on the simple case of Gaussian observation likelihoods. This paper develops a variational sparse solution for GPs under general likelihoods by providing a new characterization of the gradients required for inference in terms of individual observation likelihood terms. In addition, we propose a simple new approach for optimizing the sparse variational approximation using a fixed point computation. We demonstrate experimentally that the fixed point operator acts as a contraction in many cases and therefore leads to fast convergence. An experimental evaluation for count regression, classification, and ordinal regression illustrates the generality and advantages of the new approach.
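For readers who want the shape of the objective, the bound referred to in the abstract belongs to the standard sparse variational GP family. The following is a minimal sketch in generic notation (inducing variables u with Gaussian variational posterior q(u) = N(m, S), per-observation marginals q(f_i)), not necessarily the paper's own symbols:

\mathcal{L}(m, S) = \sum_{i=1}^{n} \mathbb{E}_{q(f_i)}\!\left[\log p(y_i \mid f_i)\right] - \mathrm{KL}\!\left(q(u) \,\|\, p(u)\right)

The likelihood enters only through the scalar expectations \mathbb{E}_{q(f_i)}[\log p(y_i \mid f_i)], which is what lets non-Gaussian observation models (Poisson counts, class labels, ordinal levels) be swapped in without changing the rest of the machinery, and it is these per-observation terms through which the paper characterizes the required gradients. The fixed point scheme then treats the stationarity conditions of such a bound as an update operator on the variational parameters. A generic, illustrative fixed-point loop is sketched below; update is a hypothetical placeholder for the paper's actual operator, not code from the paper:

import numpy as np

def fixed_point(update, x0, tol=1e-8, max_iter=200):
    """Iterate x <- update(x) until successive iterates agree to within tol.

    If update is a contraction, the distance between successive iterates
    shrinks geometrically, so the loop terminates quickly; this is the
    fast convergence the abstract describes.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        x_new = update(x)
        if np.linalg.norm(x_new - x) < tol:  # successive iterates agree
            return x_new
        x = x_new
    return x

# Toy contraction: T(x) = cos(x) has a unique fixed point near 0.739.
print(fixed_point(np.cos, 1.0))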

Cite this Paper


BibTeX
@InProceedings{pmlr-v37-sheth15,
  title     = {Sparse Variational Inference for Generalized GP Models},
  author    = {Sheth, Rishit and Wang, Yuyang and Khardon, Roni},
  booktitle = {Proceedings of the 32nd International Conference on Machine Learning},
  pages     = {1302--1311},
  year      = {2015},
  editor    = {Bach, Francis and Blei, David},
  volume    = {37},
  series    = {Proceedings of Machine Learning Research},
  address   = {Lille, France},
  month     = {07--09 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v37/sheth15.pdf},
  url       = {https://proceedings.mlr.press/v37/sheth15.html},
  abstract  = {Gaussian processes (GP) provide an attractive machine learning model due to their non-parametric form, their flexibility to capture many types of observation data, and their generic inference procedures. Sparse GP inference algorithms address the cubic complexity of GPs by focusing on a small set of pseudo-samples. To date, such approaches have focused on the simple case of Gaussian observation likelihoods. This paper develops a variational sparse solution for GPs under general likelihoods by providing a new characterization of the gradients required for inference in terms of individual observation likelihood terms. In addition, we propose a simple new approach for optimizing the sparse variational approximation using a fixed point computation. We demonstrate experimentally that the fixed point operator acts as a contraction in many cases and therefore leads to fast convergence. An experimental evaluation for count regression, classification, and ordinal regression illustrates the generality and advantages of the new approach.}
}
EndNote
%0 Conference Paper
%T Sparse Variational Inference for Generalized GP Models
%A Rishit Sheth
%A Yuyang Wang
%A Roni Khardon
%B Proceedings of the 32nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2015
%E Francis Bach
%E David Blei
%F pmlr-v37-sheth15
%I PMLR
%P 1302--1311
%U https://proceedings.mlr.press/v37/sheth15.html
%V 37
%X Gaussian processes (GP) provide an attractive machine learning model due to their non-parametric form, their flexibility to capture many types of observation data, and their generic inference procedures. Sparse GP inference algorithms address the cubic complexity of GPs by focusing on a small set of pseudo-samples. To date, such approaches have focused on the simple case of Gaussian observation likelihoods. This paper develops a variational sparse solution for GPs under general likelihoods by providing a new characterization of the gradients required for inference in terms of individual observation likelihood terms. In addition, we propose a simple new approach for optimizing the sparse variational approximation using a fixed point computation. We demonstrate experimentally that the fixed point operator acts as a contraction in many cases and therefore leads to fast convergence. An experimental evaluation for count regression, classification, and ordinal regression illustrates the generality and advantages of the new approach.
RIS
TY - CPAPER
TI - Sparse Variational Inference for Generalized GP Models
AU - Rishit Sheth
AU - Yuyang Wang
AU - Roni Khardon
BT - Proceedings of the 32nd International Conference on Machine Learning
DA - 2015/06/01
ED - Francis Bach
ED - David Blei
ID - pmlr-v37-sheth15
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 37
SP - 1302
EP - 1311
L1 - http://proceedings.mlr.press/v37/sheth15.pdf
UR - https://proceedings.mlr.press/v37/sheth15.html
AB - Gaussian processes (GP) provide an attractive machine learning model due to their non-parametric form, their flexibility to capture many types of observation data, and their generic inference procedures. Sparse GP inference algorithms address the cubic complexity of GPs by focusing on a small set of pseudo-samples. To date, such approaches have focused on the simple case of Gaussian observation likelihoods. This paper develops a variational sparse solution for GPs under general likelihoods by providing a new characterization of the gradients required for inference in terms of individual observation likelihood terms. In addition, we propose a simple new approach for optimizing the sparse variational approximation using a fixed point computation. We demonstrate experimentally that the fixed point operator acts as a contraction in many cases and therefore leads to fast convergence. An experimental evaluation for count regression, classification, and ordinal regression illustrates the generality and advantages of the new approach.
ER -
APA
Sheth, R., Wang, Y. & Khardon, R. (2015). Sparse Variational Inference for Generalized GP Models. Proceedings of the 32nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 37:1302-1311. Available from https://proceedings.mlr.press/v37/sheth15.html.
