Efficient Multioutput Gaussian Processes through Variational Inducing Kernels

Mauricio Álvarez, David Luengo, Michalis Titsias, Neil D. Lawrence
Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, PMLR 9:25-32, 2010.

Abstract

Interest in multioutput kernel methods is increasing, whether under the guise of multitask learning, multisensor networks or structured output data. From the Gaussian process perspective a multioutput Mercer kernel is a covariance function over correlated output functions. One way to construct such kernels is based on convolution processes (CP). A key problem for this approach is efficient inference. Álvarez and Lawrence recently presented a sparse approximation for CPs that enabled efficient inference. In this paper, we extend this work in two directions: we introduce the concept of variational inducing functions to handle potentially non-smooth functions involved in the kernel CP construction, and we consider an alternative approach to approximate inference based on variational methods, extending the work by Titsias (2009) to the multiple-output case. We demonstrate our approaches on prediction of school marks, compiler performance and financial time series.
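
For readers unfamiliar with the convolution-process construction the abstract refers to, the following is a minimal sketch in standard notation; the symbols G_d, u and k_u are generic choices for illustration, not necessarily those used in the paper. Each output f_d is obtained by convolving a smoothing kernel G_d with a shared latent process u,

\[
  f_d(\mathbf{x}) = \int G_d(\mathbf{x} - \mathbf{z})\, u(\mathbf{z})\, d\mathbf{z},
\]

so that, if u is a Gaussian process with covariance k_u, the outputs are jointly Gaussian with cross-covariance

\[
  \operatorname{cov}\bigl[f_d(\mathbf{x}), f_{d'}(\mathbf{x}')\bigr]
  = \int\!\!\int G_d(\mathbf{x} - \mathbf{z})\, k_u(\mathbf{z}, \mathbf{z}')\, G_{d'}(\mathbf{x}' - \mathbf{z}')\, d\mathbf{z}\, d\mathbf{z}'.
\]

The variational approach mentioned above builds on the single-output sparse lower bound of Titsias (2009), which for n training points and m inducing variables reads

\[
  \log p(\mathbf{y}) \;\ge\; \log \mathcal{N}\bigl(\mathbf{y} \mid \mathbf{0},\, \mathbf{Q}_{nn} + \sigma^2 \mathbf{I}\bigr)
  \;-\; \tfrac{1}{2\sigma^2}\operatorname{tr}\bigl(\mathbf{K}_{nn} - \mathbf{Q}_{nn}\bigr),
  \qquad \mathbf{Q}_{nn} = \mathbf{K}_{nm}\mathbf{K}_{mm}^{-1}\mathbf{K}_{mn};
\]

the paper extends this idea to the multiple-output CP setting.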

Cite this Paper


BibTeX
@InProceedings{pmlr-v9-alvarez10a,
  title     = {Efficient Multioutput Gaussian Processes through Variational Inducing Kernels},
  author    = {Álvarez, Mauricio and Luengo, David and Titsias, Michalis and Lawrence, Neil D.},
  booktitle = {Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics},
  pages     = {25--32},
  year      = {2010},
  editor    = {Teh, Yee Whye and Titterington, Mike},
  volume    = {9},
  series    = {Proceedings of Machine Learning Research},
  address   = {Chia Laguna Resort, Sardinia, Italy},
  month     = {13--15 May},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v9/alvarez10a/alvarez10a.pdf},
  url       = {https://proceedings.mlr.press/v9/alvarez10a.html},
  abstract  = {Interest in multioutput kernel methods is increasing, whether under the guise of multitask learning, multisensor networks or structured output data. From the Gaussian process perspective a multioutput Mercer kernel is a covariance function over correlated output functions. One way to construct such kernels is based on convolution processes (CP). A key problem for this approach is efficient inference. Alvarez and Lawrence recently presented a sparse approximation for CPs that enabled efficient inference. In this paper, we extend this work in two directions: we introduce the concept of variational inducing functions to handle potential non-smooth functions involved in the kernel CP construction and we consider an alternative approach to approximate inference based on variational methods, extending the work by Titsias (2009) to the multiple output case. We demonstrate our approaches on prediction of school marks, compiler performance and financial time series.}
}
APA
Álvarez, M., Luengo, D., Titsias, M. & Lawrence, N.D. (2010). Efficient Multioutput Gaussian Processes through Variational Inducing Kernels. Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 9:25-32. Available from https://proceedings.mlr.press/v9/alvarez10a.html.
