Extended and Unscented Kitchen Sinks

Edwin Bonilla, Daniel Steinberg, Alistair Reid
Proceedings of The 33rd International Conference on Machine Learning, PMLR 48:1651-1659, 2016.

Abstract

We propose a scalable multiple-output generalization of unscented and extended Gaussian processes. These algorithms have been designed to handle general likelihood models by linearizing them using a Taylor series or the Unscented Transform in a variational inference framework. We build upon random feature approximations of Gaussian process covariance functions and show that, on small-scale single-task problems, our methods can attain performance similar to that of the original algorithms at a lower computational cost. We also evaluate our methods at a larger scale on MNIST and on a seismic inversion, which is inherently a multi-task problem.
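
The Unscented Transform mentioned in the abstract approximates the moments of a Gaussian variable pushed through a nonlinearity (such as a non-Gaussian likelihood's inverse link) by evaluating it at a few deterministically chosen sigma points, with no derivatives required. Below is a minimal one-dimensional sketch under the classic kappa parameterization; it is illustrative only, and all names are invented for the example. The paper itself applies this idea inside a multiple-output variational inference scheme.

import numpy as np

def unscented_moments(m, v, g, kappa=2.0):
    # Approximate E[g(f)] and Var[g(f)] for f ~ N(m, v) using three
    # sigma points: the mean and one symmetric pair around it.
    s = np.sqrt((1.0 + kappa) * v)
    points = np.array([m, m + s, m - s])
    weights = np.array([kappa, 0.5, 0.5]) / (1.0 + kappa)
    y = g(points)
    mean = weights @ y
    var = weights @ (y - mean) ** 2
    return mean, var

# Example: logistic inverse link, checked against Monte Carlo.
g = lambda f: 1.0 / (1.0 + np.exp(-f))
m_ut, v_ut = unscented_moments(0.5, 1.5, g)
samples = np.random.default_rng(0).normal(0.5, np.sqrt(1.5), size=200_000)
print(m_ut, g(samples).mean())  # the two estimates should roughly agree

For a linear g the three points and weights recover the exact mean and variance, which is why the transform behaves like a derivative-free linearization.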
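
The "kitchen sinks" of the title refer to random-feature approximations of kernel machines in the spirit of Rahimi and Recht's random kitchen sinks; the "random feature approximations of Gaussian process covariance functions" in the abstract are of this kind. As a minimal sketch of that building block only (not the paper's method; the function names below are invented for the example), random Fourier features replace an RBF covariance with a finite feature map:

import numpy as np

def random_fourier_features(X, n_features=500, lengthscale=1.0, seed=0):
    # Feature map phi such that phi(x) @ phi(y) approximates the
    # RBF kernel exp(-||x - y||^2 / (2 * lengthscale^2)).
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.normal(scale=1.0 / lengthscale, size=(d, n_features))  # spectral samples
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)             # random phases
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# Sanity check: feature-space inner products converge to the exact kernel.
X = np.random.default_rng(1).normal(size=(5, 3))
Phi = random_fourier_features(X, n_features=20000)
K_approx = Phi @ Phi.T
K_exact = np.exp(-0.5 * ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
print(np.abs(K_approx - K_exact).max())  # shrinks as n_features grows

With such a map, Gaussian process regression reduces to Bayesian linear regression on the features, which is what makes the approach scalable.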

Cite this Paper


BibTeX
@InProceedings{pmlr-v48-bonilla16,
  title     = {Extended and Unscented Kitchen Sinks},
  author    = {Bonilla, Edwin and Steinberg, Daniel and Reid, Alistair},
  booktitle = {Proceedings of The 33rd International Conference on Machine Learning},
  pages     = {1651--1659},
  year      = {2016},
  editor    = {Balcan, Maria Florina and Weinberger, Kilian Q.},
  volume    = {48},
  series    = {Proceedings of Machine Learning Research},
  address   = {New York, New York, USA},
  month     = {20--22 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v48/bonilla16.pdf},
  url       = {https://proceedings.mlr.press/v48/bonilla16.html},
  abstract  = {We propose a scalable multiple-output generalization of unscented and extended Gaussian processes. These algorithms have been designed to handle general likelihood models by linearizing them using a Taylor series or the Unscented Transform in a variational inference framework. We build upon random feature approximations of Gaussian process covariance functions and show that, on small-scale single-task problems, our methods can attain similar performance as the original algorithms while having less computational cost. We also evaluate our methods at a larger scale on MNIST and on a seismic inversion which is inherently a multi-task problem.}
}
Endnote
%0 Conference Paper
%T Extended and Unscented Kitchen Sinks
%A Edwin Bonilla
%A Daniel Steinberg
%A Alistair Reid
%B Proceedings of The 33rd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2016
%E Maria Florina Balcan
%E Kilian Q. Weinberger
%F pmlr-v48-bonilla16
%I PMLR
%P 1651--1659
%U https://proceedings.mlr.press/v48/bonilla16.html
%V 48
%X We propose a scalable multiple-output generalization of unscented and extended Gaussian processes. These algorithms have been designed to handle general likelihood models by linearizing them using a Taylor series or the Unscented Transform in a variational inference framework. We build upon random feature approximations of Gaussian process covariance functions and show that, on small-scale single-task problems, our methods can attain similar performance as the original algorithms while having less computational cost. We also evaluate our methods at a larger scale on MNIST and on a seismic inversion which is inherently a multi-task problem.
RIS
TY  - CPAPER
TI  - Extended and Unscented Kitchen Sinks
AU  - Edwin Bonilla
AU  - Daniel Steinberg
AU  - Alistair Reid
BT  - Proceedings of The 33rd International Conference on Machine Learning
DA  - 2016/06/11
ED  - Maria Florina Balcan
ED  - Kilian Q. Weinberger
ID  - pmlr-v48-bonilla16
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 48
SP  - 1651
EP  - 1659
L1  - http://proceedings.mlr.press/v48/bonilla16.pdf
UR  - https://proceedings.mlr.press/v48/bonilla16.html
AB  - We propose a scalable multiple-output generalization of unscented and extended Gaussian processes. These algorithms have been designed to handle general likelihood models by linearizing them using a Taylor series or the Unscented Transform in a variational inference framework. We build upon random feature approximations of Gaussian process covariance functions and show that, on small-scale single-task problems, our methods can attain similar performance as the original algorithms while having less computational cost. We also evaluate our methods at a larger scale on MNIST and on a seismic inversion which is inherently a multi-task problem.
ER  -
APA
Bonilla, E., Steinberg, D. & Reid, A. (2016). Extended and Unscented Kitchen Sinks. Proceedings of The 33rd International Conference on Machine Learning, in Proceedings of Machine Learning Research 48:1651-1659. Available from https://proceedings.mlr.press/v48/bonilla16.html.

Related Material

Download PDF: http://proceedings.mlr.press/v48/bonilla16.pdf