Gibbs Max-Margin Topic Models with Fast Sampling Algorithms


Jun Zhu, Ning Chen, Hugh Perkins, Bo Zhang;
Proceedings of the 30th International Conference on Machine Learning, PMLR 28(1):124-132, 2013.

Abstract

Existing max-margin supervised topic models rely on an iterative procedure that solves multiple latent SVM subproblems under additional mean-field assumptions on the desired posterior distributions. This paper presents Gibbs max-margin supervised topic models, which minimize an expected margin loss: an upper bound of the existing margin loss derived from an expected prediction rule. By introducing augmented variables, we develop simple and fast Gibbs sampling algorithms, for both classification and regression, that require neither restrictive assumptions nor the solution of SVM subproblems. Empirical results demonstrate significant improvements in time efficiency. Classification performance is also significantly improved over competitors.
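The data-augmentation idea mentioned in the abstract can be sketched as follows. This is an illustrative reconstruction, not the paper's exact algorithm: for a hinge-loss term with margin variable `zeta` and regularization constant `c` (both hypothetical names here), the augmented variable admits a conditional whose reciprocal is inverse-Gaussian, so a Gibbs sweep can draw it directly instead of solving an SVM subproblem.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_lambda(zeta, c=1.0):
    """Illustrative augmentation step: draw 1/lambda from an
    inverse-Gaussian (Wald) distribution given the current margin
    values zeta, turning the max-margin term into a Gaussian scale
    mixture that the remaining Gibbs conditionals can exploit."""
    mean = 1.0 / (c * np.abs(zeta) + 1e-12)  # guard against zeta == 0
    lam_inv = rng.wald(mean, 1.0)            # NumPy's Wald == inverse Gaussian
    return 1.0 / lam_inv                     # lambda feeds the next conditional

# one augmented draw per document margin (toy values)
zetas = np.array([0.5, -1.3, 2.0])
lams = sample_lambda(zetas)
```

With the augmented variables in hand, the conditionals for the topic assignments and the prediction weights become standard closed-form draws, which is what makes the sampler fast.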
