Extended and Unscented Kitchen Sinks


Edwin Bonilla, Daniel Steinberg, Alistair Reid;
Proceedings of The 33rd International Conference on Machine Learning, PMLR 48:1651-1659, 2016.


We propose a scalable multiple-output generalization of unscented and extended Gaussian processes. These algorithms handle general likelihood models by linearizing them with a Taylor series or the Unscented Transform within a variational inference framework. We build upon random feature approximations of Gaussian process covariance functions and show that, on small-scale single-task problems, our methods attain performance similar to the original algorithms at lower computational cost. We also evaluate our methods at larger scale on MNIST and on a seismic inversion problem, which is inherently multi-task.
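The random feature ("kitchen sinks") approximation the abstract refers to replaces a Gaussian process covariance function with an explicit finite-dimensional feature map. The sketch below is not the authors' code; it is a minimal illustration of random Fourier features for an assumed RBF kernel, where the inner product of the features approximates the exact kernel:

```python
import numpy as np

def random_fourier_features(X, n_features=500, lengthscale=1.0, seed=0):
    """Map inputs X (n, d) to random Fourier features so that
    phi(x) @ phi(x') approximates the RBF kernel
    k(x, x') = exp(-||x - x'||^2 / (2 * lengthscale^2))."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Frequencies sampled from the kernel's spectral density (Gaussian).
    W = rng.normal(scale=1.0 / lengthscale, size=(d, n_features))
    # Random phases uniform on [0, 2*pi).
    b = rng.uniform(0.0, 2 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# Compare the feature-based approximation to the exact RBF kernel.
X = np.random.default_rng(1).normal(size=(5, 3))
Phi = random_fourier_features(X, n_features=20000)
K_approx = Phi @ Phi.T
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_exact = np.exp(-0.5 * sq_dists)
print(np.max(np.abs(K_approx - K_exact)))  # small approximation error
```

With the kernel replaced by explicit features, inference reduces to a (Bayesian) linear model in the feature space, which is what makes the approach scalable.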
