Scalable Multi-Class Gaussian Process Classification using Expectation Propagation
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:3550-3559, 2017.
Abstract
This paper describes an expectation propagation (EP) method for multi-class classification with Gaussian processes that scales well to very large datasets. In this method, the estimate of the log-marginal-likelihood involves a sum across the data instances, which enables efficient training using stochastic gradients and mini-batches. When this type of training is used, the computational cost of each iteration does not depend on the number of data instances N. Furthermore, extra assumptions in the approximate inference process make the memory cost independent of N as well. As a consequence, the proposed EP method can be used on datasets with millions of instances. We compare this method empirically with alternative approaches that approximate the required computations using variational inference. The results show that it performs similarly to, or even better than, these techniques, which sometimes give significantly worse predictive distributions in terms of the test log-likelihood. Moreover, the training process of the proposed approach also seems to converge in a smaller number of iterations.
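The scalability argument above rests on a general property: when an objective decomposes as a sum over the N data instances, a mini-batch gives an unbiased estimate of the full gradient, so the cost per update depends on the batch size rather than on N. The sketch below illustrates that idea only; it uses a toy quadratic per-instance loss as a stand-in for the paper's EP terms, and all names in it are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1000
x = rng.normal(size=N)   # synthetic 1-d "dataset" (assumption, for illustration)
theta = 0.0              # scalar parameter to fit

def grad_instance(theta, xn):
    # Gradient of a toy per-instance loss l_n = 0.5 * (theta - xn)^2,
    # standing in for the per-instance terms of the EP objective.
    return theta - xn

def minibatch_grad(theta, x, batch_size=32):
    # Unbiased stochastic estimate of the full-sum gradient:
    # rescale the mini-batch sum by N / batch_size.
    idx = rng.choice(len(x), size=batch_size, replace=False)
    return (len(x) / batch_size) * np.sum(grad_instance(theta, x[idx]))

# A few steps of stochastic gradient descent; the cost of each step
# depends on the batch size, not on N, mirroring the claim above.
lr = 1e-4
for _ in range(500):
    theta -= lr * minibatch_grad(theta, x)
```

Under this toy loss, theta drifts toward the sample mean of x, while each update touches only 32 of the 1000 instances.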