Squared-loss Mutual Information Regularization: A Novel Information-theoretic Approach to Semi-supervised Learning
Proceedings of the 30th International Conference on Machine Learning, PMLR 28(3):10-18, 2013.
Abstract
We propose squared-loss mutual information regularization (SMIR) for multi-class probabilistic classification, following the information maximization principle. SMIR is convex under mild conditions and thus remedies the nonconvexity of mutual information regularization. It offers all of the following four abilities to semi-supervised algorithms: an analytical solution, out-of-sample classification, multi-class classification, and probabilistic output. Furthermore, novel generalization error bounds are derived. Experiments show that SMIR compares favorably with state-of-the-art methods.
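For context, the squared-loss mutual information between an input X and a discrete class label Y is usually defined in this line of work as the Pearson divergence between the joint density p(x, y) and the product of marginals p(x)p(y); a minimal sketch of that definition (the notation is assumed here, not taken from the abstract):

\[
\mathrm{SMI}(X, Y) = \frac{1}{2} \sum_{y} \int p(x)\, p(y) \left( \frac{p(x, y)}{p(x)\, p(y)} - 1 \right)^{2} dx
\]

This quantity vanishes if and only if X and Y are statistically independent, so maximizing it over unlabeled data encourages class-posterior estimates that are informative about the input, which is the information maximization principle the abstract refers to.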