Multi-class Semi-supervised Learning with the $\epsilon$-truncated Multinomial Probit Gaussian Process

Simon Rogers, Mark Girolami
Gaussian Processes in Practice, PMLR 1:17-32, 2007.

Abstract

Recently, the null category noise model has been proposed as a simple and elegant solution to the problem of incorporating unlabeled data into a Gaussian process (GP) classification model. In this paper, we show how this binary likelihood model can be generalised to the multi-class setting through the use of the multinomial probit GP classifier. We present a Gibbs sampling scheme for sampling the GP parameters and also derive a more efficient variational updating scheme. We find that the performance improvement is roughly consistent with that observed in binary classification and that there is no significant difference in classification performance between the Gibbs sampling and variational schemes.

Cite this Paper


BibTeX
@InProceedings{pmlr-v1-rogers07a,
  title     = {Multi-class Semi-supervised Learning with the $\epsilon$-truncated Multinomial Probit Gaussian Process},
  author    = {Simon Rogers and Mark Girolami},
  booktitle = {Gaussian Processes in Practice},
  pages     = {17--32},
  year      = {2007},
  editor    = {Neil D. Lawrence and Anton Schwaighofer and Joaquin Quiñonero Candela},
  volume    = {1},
  series    = {Proceedings of Machine Learning Research},
  address   = {Bletchley Park, UK},
  month     = {12--13 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v1/rogers07a/rogers07a.pdf},
  url       = {http://proceedings.mlr.press/v1/rogers07a.html},
  abstract  = {Recently, the null category noise model has been proposed as a simple and elegant solution to the problem of incorporating unlabeled data into a Gaussian process (GP) classification model. In this paper, we show how this binary likelihood model can be generalised to the multi-class setting through the use of the multinomial probit GP classifier. We present a Gibbs sampling scheme for sampling the GP parameters and also derive a more efficient variational updating scheme. We find that the performance improvement is roughly consistent with that observed in binary classification and that there is no significant difference in classification performance between the Gibbs sampling and variational schemes.}
}
Endnote
%0 Conference Paper
%T Multi-class Semi-supervised Learning with the $\epsilon$-truncated Multinomial Probit Gaussian Process
%A Simon Rogers
%A Mark Girolami
%B Gaussian Processes in Practice
%C Proceedings of Machine Learning Research
%D 2007
%E Neil D. Lawrence
%E Anton Schwaighofer
%E Joaquin Quiñonero Candela
%F pmlr-v1-rogers07a
%I PMLR
%J Proceedings of Machine Learning Research
%P 17--32
%U http://proceedings.mlr.press
%V 1
%W PMLR
%X Recently, the null category noise model has been proposed as a simple and elegant solution to the problem of incorporating unlabeled data into a Gaussian process (GP) classification model. In this paper, we show how this binary likelihood model can be generalised to the multi-class setting through the use of the multinomial probit GP classifier. We present a Gibbs sampling scheme for sampling the GP parameters and also derive a more efficient variational updating scheme. We find that the performance improvement is roughly consistent with that observed in binary classification and that there is no significant difference in classification performance between the Gibbs sampling and variational schemes.
RIS
TY - CPAPER
TI - Multi-class Semi-supervised Learning with the $\epsilon$-truncated Multinomial Probit Gaussian Process
AU - Simon Rogers
AU - Mark Girolami
BT - Gaussian Processes in Practice
PY - 2007/03/11
DA - 2007/03/11
ED - Neil D. Lawrence
ED - Anton Schwaighofer
ED - Joaquin Quiñonero Candela
ID - pmlr-v1-rogers07a
PB - PMLR
SP - 17
DP - PMLR
EP - 32
L1 - http://proceedings.mlr.press/v1/rogers07a/rogers07a.pdf
UR - http://proceedings.mlr.press/v1/rogers07a.html
AB - Recently, the null category noise model has been proposed as a simple and elegant solution to the problem of incorporating unlabeled data into a Gaussian process (GP) classification model. In this paper, we show how this binary likelihood model can be generalised to the multi-class setting through the use of the multinomial probit GP classifier. We present a Gibbs sampling scheme for sampling the GP parameters and also derive a more efficient variational updating scheme. We find that the performance improvement is roughly consistent with that observed in binary classification and that there is no significant difference in classification performance between the Gibbs sampling and variational schemes.
ER -
APA
Rogers, S., & Girolami, M. (2007). Multi-class Semi-supervised Learning with the $\epsilon$-truncated Multinomial Probit Gaussian Process. Gaussian Processes in Practice, in PMLR 1:17-32.
