Localizing and Amortizing: Efficient Inference for Gaussian Processes

Linfeng Liu, Liping Liu
Proceedings of The 12th Asian Conference on Machine Learning, PMLR 129:823-836, 2020.

Abstract

The inference of Gaussian Processes concerns the distribution of the underlying function given observed data points. GP inference based on local ranges of data points captures fine-scale correlations and allows fine-grained decomposition of the computation. Following this direction, we propose a new inference model that considers the correlations and observations of the K nearest neighbors for the inference at a data point. Compared with previous work, we also eliminate the data-ordering prerequisite to simplify the inference process. Additionally, the inference task is decomposed into small subtasks with several technical innovations, making our model well suited to stochastic optimization. Since the decomposed subtasks share the same structure, we further speed up the inference procedure with amortized inference. Our model runs efficiently and achieves good performance on several benchmark tasks.
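To make the localizing idea concrete, the sketch below shows exact GP prediction at a query point conditioned only on its K nearest training points, which is the kind of localized computation the abstract describes. This is not the authors' code: the RBF kernel, fixed hyperparameters, noise variance, brute-force neighbor search, and the function names are illustrative assumptions, and the paper's amortized variational inference is not reproduced here.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel matrix between the rows of A and B.
    sq_dists = (np.sum(A**2, axis=1)[:, None]
                + np.sum(B**2, axis=1)[None, :]
                - 2.0 * A @ B.T)
    return variance * np.exp(-0.5 * sq_dists / lengthscale**2)

def local_gp_predict(x_star, X, y, K=16, noise_var=1e-2):
    # Exact GP posterior at x_star, conditioned only on its K nearest
    # training points (brute-force neighbor search for simplicity).
    idx = np.argsort(np.sum((X - x_star)**2, axis=1))[:K]
    Xk, yk = X[idx], y[idx]
    Knn = rbf_kernel(Xk, Xk) + noise_var * np.eye(K)
    k_star = rbf_kernel(Xk, x_star[None, :])          # shape (K, 1)
    alpha = np.linalg.solve(Knn, yk)                  # Knn^{-1} y_k
    mean = (k_star.T @ alpha).item()
    var = (rbf_kernel(x_star[None, :], x_star[None, :])
           - k_star.T @ np.linalg.solve(Knn, k_star)).item()
    return mean, var + noise_var

# Toy usage: noisy sine data, prediction at a single query point.
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(500, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(500)
print(local_gp_predict(np.array([0.5]), X, y))
```

Each such prediction touches only a K-by-K kernel matrix, so the per-point subtasks are small and identically structured, which is what makes them amenable to stochastic optimization and to amortization as described in the abstract.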

Cite this Paper


BibTeX
@InProceedings{pmlr-v129-liu20b,
  title     = {Localizing and Amortizing: Efficient Inference for Gaussian Processes},
  author    = {Liu, Linfeng and Liu, Liping},
  booktitle = {Proceedings of The 12th Asian Conference on Machine Learning},
  pages     = {823--836},
  year      = {2020},
  editor    = {Pan, Sinno Jialin and Sugiyama, Masashi},
  volume    = {129},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--20 Nov},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v129/liu20b/liu20b.pdf},
  url       = {https://proceedings.mlr.press/v129/liu20b.html},
  abstract  = {The inference of Gaussian Processes concerns the distribution of the underlying function given observed data points. GP inference based on local ranges of data points is able to capture fine-scale correlations and allow fine-grained decomposition of the computation. Following this direction, we propose a new inference model that considers the correlations and observations of the K nearest neighbors for the inference at a data point. Compared with previous works, we also eliminate the data ordering prerequisite to simplify the inference process. Additionally, the inference task is decomposed to small subtasks with several technique innovations, making our model well suits the stochastic optimization. Since the decomposed small subtasks have the same structure, we further speed up the inference procedure with amortized inference. Our model runs efficiently and achieves good performances on several benchmark tasks.}
}
Endnote
%0 Conference Paper
%T Localizing and Amortizing: Efficient Inference for Gaussian Processes
%A Linfeng Liu
%A Liping Liu
%B Proceedings of The 12th Asian Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Sinno Jialin Pan
%E Masashi Sugiyama
%F pmlr-v129-liu20b
%I PMLR
%P 823--836
%U https://proceedings.mlr.press/v129/liu20b.html
%V 129
%X The inference of Gaussian Processes concerns the distribution of the underlying function given observed data points. GP inference based on local ranges of data points is able to capture fine-scale correlations and allow fine-grained decomposition of the computation. Following this direction, we propose a new inference model that considers the correlations and observations of the K nearest neighbors for the inference at a data point. Compared with previous works, we also eliminate the data ordering prerequisite to simplify the inference process. Additionally, the inference task is decomposed to small subtasks with several technique innovations, making our model well suits the stochastic optimization. Since the decomposed small subtasks have the same structure, we further speed up the inference procedure with amortized inference. Our model runs efficiently and achieves good performances on several benchmark tasks.
APA
Liu, L. & Liu, L. (2020). Localizing and Amortizing: Efficient Inference for Gaussian Processes. Proceedings of The 12th Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 129:823-836. Available from https://proceedings.mlr.press/v129/liu20b.html.
