Multi-class Semi-supervised Learning with the $\epsilon$-truncated Multinomial Probit Gaussian Process

Simon Rogers, Mark Girolami
Gaussian Processes in Practice, PMLR 1:17-32, 2007.

Abstract

Recently, the null category noise model has been proposed as a simple and elegant solution to the problem of incorporating unlabeled data into a Gaussian process (GP) classification model. In this paper, we show how this binary likelihood model can be generalised to the multi-class setting through the use of the multinomial probit GP classifier. We present a Gibbs sampling scheme for sampling the GP parameters and also derive a more efficient variational updating scheme. We find that the performance improvement is roughly consistent with that observed in binary classification and that there is no significant difference in classification performance between the Gibbs sampling and variational schemes.
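
For background, the likelihood underlying the multinomial probit GP classifier mentioned in the abstract is commonly written in the following standard form (shown here only as context; it is not a reproduction of the paper's $\epsilon$-truncated variant, which modifies this likelihood to carve out a null-category region around the decision boundary for unlabeled points):

$p(t_n = k \mid \mathbf{f}_n) = \mathbb{E}_{u \sim \mathcal{N}(0,1)}\left[ \prod_{j \neq k} \Phi\left(u + f_{n,k} - f_{n,j}\right) \right],$

where $f_{n,j}$ is the latent GP function value for class $j$ at the $n$-th input and $\Phi(\cdot)$ is the standard normal cumulative distribution function.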

Cite this Paper


BibTeX
@InProceedings{pmlr-v1-rogers07a,
  title     = {Multi-class Semi-supervised Learning with the $\epsilon$-truncated Multinomial Probit Gaussian Process},
  author    = {Rogers, Simon and Girolami, Mark},
  booktitle = {Gaussian Processes in Practice},
  pages     = {17--32},
  year      = {2007},
  editor    = {Lawrence, Neil D. and Schwaighofer, Anton and Quiñonero Candela, Joaquin},
  volume    = {1},
  series    = {Proceedings of Machine Learning Research},
  address   = {Bletchley Park, UK},
  month     = {12--13 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v1/rogers07a/rogers07a.pdf},
  url       = {https://proceedings.mlr.press/v1/rogers07a.html},
  abstract  = {Recently, the null category noise model has been proposed as a simple and elegant solution to the problem of incorporating unlabeled data into a Gaussian process (GP) classification model. In this paper, we show how this binary likelihood model can be generalised to the multi-class setting through the use of the multinomial probit GP classifier. We present a Gibbs sampling scheme for sampling the GP parameters and also derive a more efficient variational updating scheme. We find that the performance improvement is roughly consistent with that observed in binary classification and that there is no significant difference in classification performance between the Gibbs sampling and variational schemes.}
}
APA
Rogers, S. & Girolami, M. (2007). Multi-class Semi-supervised Learning with the $\epsilon$-truncated Multinomial Probit Gaussian Process. Gaussian Processes in Practice, in Proceedings of Machine Learning Research 1:17-32. Available from https://proceedings.mlr.press/v1/rogers07a.html.
