Non-Negative Semi-Supervised Learning

Changhu Wang, Shuicheng Yan, Lei Zhang, Hongjiang Zhang
Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, PMLR 5:575-582, 2009.

Abstract

The contributions of this paper are three-fold. First, we present a general formulation that combines the benefits of non-negative data factorization and semi-supervised learning; the resulting solution naturally possesses sparsity, robustness to partial occlusions, and greater discriminating power gained from extra unlabeled data. Second, we propose an efficient multiplicative updating procedure together with a theoretical justification of its convergence. Finally, we briefly describe the tensorization of this general formulation for non-negative semi-supervised learning, which handles tensor data of arbitrary order. Extensive experiments against state-of-the-art algorithms for non-negative data factorization and semi-supervised learning demonstrate the algorithm's sparsity, classification power, and robustness to image occlusions.
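The "multiplicative updating procedure" mentioned in the abstract belongs to the family of multiplicative update rules popularized for non-negative matrix factorization by Lee and Seung. As a point of reference only, here is a minimal sketch of those classic NMF updates, not the paper's semi-supervised variant; all variable names, sizes, and iteration counts are illustrative assumptions.

```python
# Sketch of classic multiplicative updates for plain NMF (Lee & Seung),
# the update family the paper's semi-supervised procedure extends.
# This is NOT the paper's algorithm; it omits the label/graph terms.
import numpy as np

rng = np.random.default_rng(0)
V = rng.random((20, 30))   # non-negative data matrix (features x samples)
r = 5                      # factorization rank (illustrative choice)
W = rng.random((20, r))    # basis matrix
H = rng.random((r, 30))    # coefficient matrix
eps = 1e-9                 # guards against division by zero

errors = []
for _ in range(100):
    # Each factor is rescaled elementwise by a ratio of non-negative
    # matrices, so non-negativity is preserved automatically.
    H *= (W.T @ V) / (W.T @ W @ H + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)
    errors.append(np.linalg.norm(V - W @ H))
```

Because the updates are multiplicative rather than additive, no projection step is needed to keep `W` and `H` non-negative, and the Frobenius reconstruction error is non-increasing across iterations.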

Cite this Paper


BibTeX
@InProceedings{pmlr-v5-wang09a,
  title     = {Non-Negative Semi-Supervised Learning},
  author    = {Changhu Wang and Shuicheng Yan and Lei Zhang and Hongjiang Zhang},
  pages     = {575--582},
  year      = {2009},
  editor    = {David van Dyk and Max Welling},
  volume    = {5},
  series    = {Proceedings of Machine Learning Research},
  address   = {Hilton Clearwater Beach Resort, Clearwater Beach, Florida USA},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v5/wang09a/wang09a.pdf},
  url       = {http://proceedings.mlr.press/v5/wang09a.html},
  abstract  = {The contributions of this paper are three-fold. First, we present a general formulation for reaping the benefits from both non-negative data factorization and semi-supervised learning, and the solution naturally possesses the characteristics of sparsity, robustness to partial occlusions, and greater discriminating power via extra unlabeled data. Then, an efficient multiplicative updating procedure is proposed along with its theoretic justification of the algorithmic convergency. Finally, the tensorization of this general formulation for non-negative semi-supervised learning is also briefed for handling tensor data of arbitrary order. Extensive experiments compared with the state-of-the-art algorithms for non-negative data factorization and semi-supervised learning demonstrate the algorithmic properties in sparsity, classification power, and robustness to image occlusions.}
}
Endnote
%0 Conference Paper
%T Non-Negative Semi-Supervised Learning
%A Changhu Wang
%A Shuicheng Yan
%A Lei Zhang
%A Hongjiang Zhang
%B Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2009
%E David van Dyk
%E Max Welling
%F pmlr-v5-wang09a
%I PMLR
%J Proceedings of Machine Learning Research
%P 575--582
%U http://proceedings.mlr.press
%V 5
%W PMLR
%X The contributions of this paper are three-fold. First, we present a general formulation for reaping the benefits from both non-negative data factorization and semi-supervised learning, and the solution naturally possesses the characteristics of sparsity, robustness to partial occlusions, and greater discriminating power via extra unlabeled data. Then, an efficient multiplicative updating procedure is proposed along with its theoretic justification of the algorithmic convergency. Finally, the tensorization of this general formulation for non-negative semi-supervised learning is also briefed for handling tensor data of arbitrary order. Extensive experiments compared with the state-of-the-art algorithms for non-negative data factorization and semi-supervised learning demonstrate the algorithmic properties in sparsity, classification power, and robustness to image occlusions.
RIS
TY  - CPAPER
TI  - Non-Negative Semi-Supervised Learning
AU  - Changhu Wang
AU  - Shuicheng Yan
AU  - Lei Zhang
AU  - Hongjiang Zhang
BT  - Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics
PY  - 2009/04/15
DA  - 2009/04/15
ED  - David van Dyk
ED  - Max Welling
ID  - pmlr-v5-wang09a
PB  - PMLR
SP  - 575
EP  - 582
DP  - PMLR
L1  - http://proceedings.mlr.press/v5/wang09a/wang09a.pdf
UR  - http://proceedings.mlr.press/v5/wang09a.html
AB  - The contributions of this paper are three-fold. First, we present a general formulation for reaping the benefits from both non-negative data factorization and semi-supervised learning, and the solution naturally possesses the characteristics of sparsity, robustness to partial occlusions, and greater discriminating power via extra unlabeled data. Then, an efficient multiplicative updating procedure is proposed along with its theoretic justification of the algorithmic convergency. Finally, the tensorization of this general formulation for non-negative semi-supervised learning is also briefed for handling tensor data of arbitrary order. Extensive experiments compared with the state-of-the-art algorithms for non-negative data factorization and semi-supervised learning demonstrate the algorithmic properties in sparsity, classification power, and robustness to image occlusions.
ER  -
APA
Wang, C., Yan, S., Zhang, L. & Zhang, H. (2009). Non-Negative Semi-Supervised Learning. Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, in PMLR 5:575-582.