Multi-Class Gaussian Process Classification Made Conjugate: Efficient Inference via Data Augmentation

Théo Galy-Fajou, Florian Wenzel, Christian Donner, Manfred Opper
Proceedings of The 35th Uncertainty in Artificial Intelligence Conference, PMLR 115:755-765, 2020.

Abstract

We propose a new scalable multi-class Gaussian process classification approach building on a novel modified softmax likelihood function. The new likelihood has two benefits: it leads to well-calibrated uncertainty estimates and allows for an efficient latent variable augmentation. The augmented model has the advantage that it is conditionally conjugate, leading to a fast variational inference method via block coordinate ascent updates. Previous approaches suffered from a trade-off between uncertainty calibration and speed. Our experiments show that our method leads to well-calibrated uncertainty estimates and competitive predictive performance while being up to two orders of magnitude faster than the state of the art.
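The abstract does not spell out the form of the modified softmax likelihood; that is defined in the paper itself. Purely as an illustration of how a softmax can be made augmentation-friendly, one common modification replaces the exponential class scores with logistic sigmoids (sometimes called a logistic-softmax). The sketch below assumes this form, with f_1, ..., f_K denoting the latent GP values for the K classes at an input:

    p(y = k | f_1, ..., f_K) = \sigma(f_k) / \sum_{j=1}^{K} \sigma(f_j),  \qquad  \sigma(z) = \frac{1}{1 + e^{-z}},

in contrast to the standard softmax p(y = k | f) = e^{f_k} / \sum_j e^{f_j}. Sigmoid factors of this kind can typically be handled with Pólya-Gamma-style latent variable augmentation, which is one route to the conditional conjugacy and block coordinate ascent updates mentioned in the abstract.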

Cite this Paper


BibTeX
@InProceedings{pmlr-v115-galy-fajou20a,
  title     = {Multi-Class Gaussian Process Classification Made Conjugate: Efficient Inference via Data Augmentation},
  author    = {Galy-Fajou, Th{\'{e}}o and Wenzel, Florian and Donner, Christian and Opper, Manfred},
  booktitle = {Proceedings of The 35th Uncertainty in Artificial Intelligence Conference},
  pages     = {755--765},
  year      = {2020},
  editor    = {Adams, Ryan P. and Gogate, Vibhav},
  volume    = {115},
  series    = {Proceedings of Machine Learning Research},
  month     = {22--25 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v115/galy-fajou20a/galy-fajou20a.pdf},
  url       = {https://proceedings.mlr.press/v115/galy-fajou20a.html},
  abstract  = {We propose a new scalable multi-class Gaussian process classification approach building on a novel modified softmax likelihood function. The new likelihood has two benefits: it leads to well-calibrated uncertainty estimates and allows for an efficient latent variable augmentation. The augmented model has the advantage that it is conditionally conjugate, leading to a fast variational inference method via block coordinate ascent updates. Previous approaches suffered from a trade-off between uncertainty calibration and speed. Our experiments show that our method leads to well-calibrated uncertainty estimates and competitive predictive performance while being up to two orders of magnitude faster than the state of the art.}
}
Endnote
%0 Conference Paper
%T Multi-Class Gaussian Process Classification Made Conjugate: Efficient Inference via Data Augmentation
%A Théo Galy-Fajou
%A Florian Wenzel
%A Christian Donner
%A Manfred Opper
%B Proceedings of The 35th Uncertainty in Artificial Intelligence Conference
%C Proceedings of Machine Learning Research
%D 2020
%E Ryan P. Adams
%E Vibhav Gogate
%F pmlr-v115-galy-fajou20a
%I PMLR
%P 755--765
%U https://proceedings.mlr.press/v115/galy-fajou20a.html
%V 115
%X We propose a new scalable multi-class Gaussian process classification approach building on a novel modified softmax likelihood function. The new likelihood has two benefits: it leads to well-calibrated uncertainty estimates and allows for an efficient latent variable augmentation. The augmented model has the advantage that it is conditionally conjugate, leading to a fast variational inference method via block coordinate ascent updates. Previous approaches suffered from a trade-off between uncertainty calibration and speed. Our experiments show that our method leads to well-calibrated uncertainty estimates and competitive predictive performance while being up to two orders of magnitude faster than the state of the art.
APA
Galy-Fajou, T., Wenzel, F., Donner, C. & Opper, M. (2020). Multi-Class Gaussian Process Classification Made Conjugate: Efficient Inference via Data Augmentation. Proceedings of The 35th Uncertainty in Artificial Intelligence Conference, in Proceedings of Machine Learning Research 115:755-765. Available from https://proceedings.mlr.press/v115/galy-fajou20a.html.