Memory-Based Dual Gaussian Processes for Sequential Learning

Paul Edmund Chang, Prakhar Verma, S. T. John, Arno Solin, Mohammad Emtiyaz Khan
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:4035-4054, 2023.

Abstract

Sequential learning with Gaussian processes (GPs) is challenging when access to past data is limited, for example, in continual and active learning. In such cases, errors can accumulate over time due to inaccuracies in the posterior, hyperparameters, and inducing points, making accurate learning challenging. Here, we present a method to keep all such errors in check using the recently proposed dual sparse variational GP. Our method enables accurate inference for generic likelihoods and improves learning by actively building and updating a memory of past data. We demonstrate its effectiveness in several applications involving Bayesian optimization, active learning, and continual learning.
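To make the setting concrete, below is a minimal, self-contained sketch of sequential GP regression that keeps only a small memory of past points and re-selects that memory after each incoming batch. This is only an illustration of the general idea described in the abstract; it is not the paper's dual sparse variational GP method, and the RBF kernel, the memory budget M, and the variance-based selection rule are all assumptions made for the example.

# Illustrative sketch only: sequential GP regression with a small "memory"
# of past points, loosely inspired by the paper's idea of actively
# maintaining a memory. This is NOT the dual sparse variational GP method
# from the paper; kernel, memory size, and selection rule are assumptions.
import numpy as np

def rbf(X, Z, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix between rows of X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_posterior(X, y, Xs, noise=0.1):
    """Exact GP regression posterior mean/variance at test inputs Xs."""
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(Xs, X)
    mean = Ks @ np.linalg.solve(K, y)
    var = rbf(Xs, Xs).diagonal() - np.einsum("ij,ji->i", Ks, np.linalg.solve(K, Ks.T))
    return mean, var

rng = np.random.default_rng(0)
memory_X = np.empty((0, 1))
memory_y = np.empty(0)
M = 20  # memory budget (assumed)

for t in range(5):  # stream of data batches arriving sequentially
    Xb = rng.uniform(-3, 3, size=(30, 1))
    yb = np.sin(Xb[:, 0]) + 0.1 * rng.standard_normal(30)
    X = np.vstack([memory_X, Xb])
    y = np.concatenate([memory_y, yb])
    # "Active" memory update: keep the M points the current posterior is
    # least certain about (a simple stand-in for a principled criterion).
    _, var = gp_posterior(X, y, X)
    keep = np.argsort(var)[-M:]
    memory_X, memory_y = X[keep], y[keep]

mean, var = gp_posterior(memory_X, memory_y, np.linspace(-3, 3, 5)[:, None])
print(mean.round(2), var.round(3))

The sketch only captures the memory-rebuilding loop: all past data outside the memory is discarded, so the choice of which points to keep directly controls how much error accumulates. In the paper itself, this memory is combined with a dual parameterization of the sparse variational GP that also keeps posterior, hyperparameter, and inducing-point errors in check.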

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-chang23a,
  title     = {Memory-Based Dual {G}aussian Processes for Sequential Learning},
  author    = {Chang, Paul Edmund and Verma, Prakhar and John, S. T. and Solin, Arno and Khan, Mohammad Emtiyaz},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {4035--4054},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/chang23a/chang23a.pdf},
  url       = {https://proceedings.mlr.press/v202/chang23a.html},
  abstract  = {Sequential learning with Gaussian processes (GPs) is challenging when access to past data is limited, for example, in continual and active learning. In such cases, errors can accumulate over time due to inaccuracies in the posterior, hyperparameters, and inducing points, making accurate learning challenging. Here, we present a method to keep all such errors in check using the recently proposed dual sparse variational GP. Our method enables accurate inference for generic likelihoods and improves learning by actively building and updating a memory of past data. We demonstrate its effectiveness in several applications involving Bayesian optimization, active learning, and continual learning.}
}
Endnote
%0 Conference Paper
%T Memory-Based Dual Gaussian Processes for Sequential Learning
%A Paul Edmund Chang
%A Prakhar Verma
%A S. T. John
%A Arno Solin
%A Mohammad Emtiyaz Khan
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-chang23a
%I PMLR
%P 4035--4054
%U https://proceedings.mlr.press/v202/chang23a.html
%V 202
%X Sequential learning with Gaussian processes (GPs) is challenging when access to past data is limited, for example, in continual and active learning. In such cases, errors can accumulate over time due to inaccuracies in the posterior, hyperparameters, and inducing points, making accurate learning challenging. Here, we present a method to keep all such errors in check using the recently proposed dual sparse variational GP. Our method enables accurate inference for generic likelihoods and improves learning by actively building and updating a memory of past data. We demonstrate its effectiveness in several applications involving Bayesian optimization, active learning, and continual learning.
APA
Chang, P.E., Verma, P., John, S.T., Solin, A. & Khan, M.E. (2023). Memory-Based Dual Gaussian Processes for Sequential Learning. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:4035-4054. Available from https://proceedings.mlr.press/v202/chang23a.html.
