Scalable Gaussian Process Classification via Expectation Propagation

Daniel Hernandez-Lobato, Jose Miguel Hernandez-Lobato
Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, PMLR 51:168-176, 2016.

Abstract

Variational methods have recently been considered for scaling the training of Gaussian process classifiers to large datasets. As an alternative, we describe here how to train these classifiers efficiently using expectation propagation (EP). The proposed EP method makes it possible to train Gaussian process classifiers on very large datasets, with millions of instances, which were out of reach for previous implementations of EP. More precisely, it can be used for (i) training in a distributed fashion, where the data instances are sent to different nodes in which the required computations are carried out, and for (ii) maximizing an estimate of the marginal likelihood using a stochastic approximation of the gradient. Several experiments involving large datasets show that the method described is competitive with the variational approach.
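For context, the classical (non-scalable) EP procedure that methods like this build on can be sketched in a few dozen lines. The following is a minimal NumPy implementation of EP for GP binary classification with a probit likelihood, in the style of Rasmussen and Williams' textbook algorithm; it is background material, not the paper's distributed or stochastic variant, and the function names and toy kernel are illustrative assumptions.

```python
import numpy as np
from math import erf, sqrt, exp, pi

def norm_cdf(z):  # standard normal CDF
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def norm_pdf(z):  # standard normal density
    return exp(-0.5 * z * z) / sqrt(2.0 * pi)

def ep_gp_classify(K, y, n_sweeps=10):
    """Classical EP for GP binary classification with a probit likelihood.
    K: n x n kernel matrix, y: labels in {-1, +1}.
    Returns the approximate posterior mean and covariance over the latents."""
    n = len(y)
    nu, tau = np.zeros(n), np.zeros(n)   # site (natural) parameters
    Sigma, mu = K.copy(), np.zeros(n)    # current Gaussian approximation
    for _ in range(n_sweeps):
        for i in range(n):
            # cavity distribution: remove site i from the posterior
            tau_c = 1.0 / Sigma[i, i] - tau[i]
            nu_c = mu[i] / Sigma[i, i] - nu[i]
            m_c, v_c = nu_c / tau_c, 1.0 / tau_c
            # moments of the tilted distribution (cavity times probit likelihood)
            z = y[i] * m_c / sqrt(1.0 + v_c)
            ratio = norm_pdf(z) / max(norm_cdf(z), 1e-12)
            m_hat = m_c + y[i] * v_c * ratio / sqrt(1.0 + v_c)
            v_hat = v_c - v_c ** 2 * ratio * (z + ratio) / (1.0 + v_c)
            # update site i so that cavity times site matches the tilted moments
            d_tau = 1.0 / v_hat - tau_c - tau[i]
            tau[i] += d_tau
            nu[i] = m_hat / v_hat - nu_c
            # rank-one (Sherman-Morrison) update of the posterior covariance
            s = Sigma[:, i].copy()
            Sigma -= (d_tau / (1.0 + d_tau * s[i])) * np.outer(s, s)
            mu = Sigma @ nu
        # recompute (Sigma, mu) from scratch after each sweep for stability
        S = np.sqrt(tau)
        L = np.linalg.cholesky(np.eye(n) + S[:, None] * K * S[None, :])
        V = np.linalg.solve(L, S[:, None] * K)
        Sigma = K - V.T @ V
        mu = Sigma @ nu
    return mu, Sigma
```

Each inner-loop site update costs O(n^2) and the whole posterior is refit jointly, which is exactly what limits classical EP to modest n; the paper's contribution is to replace this with a distributed, stochastic-gradient training scheme.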

Cite this Paper


BibTeX
@InProceedings{pmlr-v51-hernandez-lobato16,
  title     = {Scalable Gaussian Process Classification via Expectation Propagation},
  author    = {Daniel Hernandez-Lobato and Jose Miguel Hernandez-Lobato},
  booktitle = {Proceedings of the 19th International Conference on Artificial Intelligence and Statistics},
  pages     = {168--176},
  year      = {2016},
  editor    = {Arthur Gretton and Christian C. Robert},
  volume    = {51},
  series    = {Proceedings of Machine Learning Research},
  address   = {Cadiz, Spain},
  month     = {09--11 May},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v51/hernandez-lobato16.pdf},
  url       = {http://proceedings.mlr.press/v51/hernandez-lobato16.html},
  abstract  = {Variational methods have been recently considered for scaling the training process of Gaussian process classifiers to large datasets. As an alternative, we describe here how to train these classifiers efficiently using expectation propagation (EP). The proposed EP method allows to train Gaussian process classifiers on very large datasets, with millions of instances, that were out of the reach of previous implementations of EP. More precisely, it can be used for (i) training in a distributed fashion where the data instances are sent to different nodes in which the required computations are carried out, and for (ii) maximizing an estimate of the marginal likelihood using a stochastic approximation of the gradient. Several experiments involving large datasets show that the method described is competitive with the variational approach.}
}
Endnote
%0 Conference Paper
%T Scalable Gaussian Process Classification via Expectation Propagation
%A Daniel Hernandez-Lobato
%A Jose Miguel Hernandez-Lobato
%B Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2016
%E Arthur Gretton
%E Christian C. Robert
%F pmlr-v51-hernandez-lobato16
%I PMLR
%J Proceedings of Machine Learning Research
%P 168--176
%U http://proceedings.mlr.press
%V 51
%W PMLR
%X Variational methods have been recently considered for scaling the training process of Gaussian process classifiers to large datasets. As an alternative, we describe here how to train these classifiers efficiently using expectation propagation (EP). The proposed EP method allows to train Gaussian process classifiers on very large datasets, with millions of instances, that were out of the reach of previous implementations of EP. More precisely, it can be used for (i) training in a distributed fashion where the data instances are sent to different nodes in which the required computations are carried out, and for (ii) maximizing an estimate of the marginal likelihood using a stochastic approximation of the gradient. Several experiments involving large datasets show that the method described is competitive with the variational approach.
RIS
TY - CPAPER
TI - Scalable Gaussian Process Classification via Expectation Propagation
AU - Daniel Hernandez-Lobato
AU - Jose Miguel Hernandez-Lobato
BT - Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
PY - 2016/05/02
DA - 2016/05/02
ED - Arthur Gretton
ED - Christian C. Robert
ID - pmlr-v51-hernandez-lobato16
PB - PMLR
SP - 168
DP - PMLR
EP - 176
L1 - http://proceedings.mlr.press/v51/hernandez-lobato16.pdf
UR - http://proceedings.mlr.press/v51/hernandez-lobato16.html
AB - Variational methods have been recently considered for scaling the training process of Gaussian process classifiers to large datasets. As an alternative, we describe here how to train these classifiers efficiently using expectation propagation (EP). The proposed EP method allows to train Gaussian process classifiers on very large datasets, with millions of instances, that were out of the reach of previous implementations of EP. More precisely, it can be used for (i) training in a distributed fashion where the data instances are sent to different nodes in which the required computations are carried out, and for (ii) maximizing an estimate of the marginal likelihood using a stochastic approximation of the gradient. Several experiments involving large datasets show that the method described is competitive with the variational approach.
ER -
APA
Hernandez-Lobato, D. & Hernandez-Lobato, J.M. (2016). Scalable Gaussian Process Classification via Expectation Propagation. Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, in PMLR 51:168-176