Graph-based Semi-supervised Learning: Realizing Pointwise Smoothness Probabilistically

Yuan Fang, Kevin Chang, Hady Lauw
Proceedings of the 31st International Conference on Machine Learning, PMLR 32(2):406-414, 2014.

Abstract

As the central notion in semi-supervised learning, smoothness is often realized on a graph representation of the data. In this paper, we study two complementary dimensions of smoothness: its pointwise nature and probabilistic modeling. While no existing graph-based work exploits them in conjunction, we encompass both in a novel framework of Probabilistic Graph-based Pointwise Smoothness (PGP), building upon two foundational models of data closeness and label coupling. This new form of smoothness axiomatizes a set of probability constraints, which ultimately enables class prediction. Theoretically, we provide an error and robustness analysis of PGP. Empirically, we conduct extensive experiments to show the advantages of PGP.
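The abstract describes PGP only at a high level; the framework's actual models and probability constraints are defined in the full paper. For orientation, the sketch below implements the conventional kind of graph-based smoothness that the paper builds on and contrasts with: iterative label propagation over a Gaussian affinity graph. This is not PGP itself; the kernel choice, normalization, parameter names (sigma, n_iter), and the toy data at the end are all illustrative assumptions.

    # Minimal sketch of a generic graph-based semi-supervised baseline
    # (iterative label propagation over a Gaussian affinity graph).
    # NOT the paper's PGP framework; all choices below are illustrative.
    import numpy as np

    def label_propagation(X, y, sigma=1.0, n_iter=100):
        # X: (n, d) feature matrix; y: (n,) integer labels in {0..C-1}, -1 = unlabeled.
        n = X.shape[0]
        classes = np.unique(y[y >= 0])
        C = len(classes)

        # Gaussian (RBF) affinity graph W and its row-stochastic transition matrix P.
        sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
        W = np.exp(-sq_dists / (2.0 * sigma ** 2))
        np.fill_diagonal(W, 0.0)
        P = W / W.sum(axis=1, keepdims=True)

        # Label-distribution matrix F: one-hot rows for labeled points, uniform otherwise.
        F = np.full((n, C), 1.0 / C)
        labeled = y >= 0
        F[labeled] = np.eye(C)[y[labeled]]

        for _ in range(n_iter):
            F = P @ F                           # smooth label distributions over the graph
            F[labeled] = np.eye(C)[y[labeled]]  # clamp the known labels after each pass

        return classes[F.argmax(axis=1)]

    # Example: two Gaussian blobs with one labeled point each (hypothetical data).
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(2, 0.3, (20, 2))])
    y = np.full(40, -1)
    y[0], y[20] = 0, 1
    print(label_propagation(X, y, sigma=0.5))

In this baseline, each point's label distribution is repeatedly replaced by a weighted average of its neighbors', which is the graph realization of smoothness the abstract refers to. PGP differs in that it axiomatizes pointwise smoothness through probability constraints built on its data-closeness and label-coupling models, as described in the paper.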

Cite this Paper


BibTeX
@InProceedings{pmlr-v32-fang14,
  title     = {Graph-based Semi-supervised Learning: Realizing Pointwise Smoothness Probabilistically},
  author    = {Fang, Yuan and Chang, Kevin and Lauw, Hady},
  booktitle = {Proceedings of the 31st International Conference on Machine Learning},
  pages     = {406--414},
  year      = {2014},
  editor    = {Xing, Eric P. and Jebara, Tony},
  volume    = {32},
  number    = {2},
  series    = {Proceedings of Machine Learning Research},
  address   = {Beijing, China},
  month     = {22--24 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v32/fang14.pdf},
  url       = {https://proceedings.mlr.press/v32/fang14.html},
  abstract  = {As the central notion in semi-supervised learning, smoothness is often realized on a graph representation of the data. In this paper, we study two complementary dimensions of smoothness: its pointwise nature and probabilistic modeling. While no existing graph-based work exploits them in conjunction, we encompass both in a novel framework of Probabilistic Graph-based Pointwise Smoothness (PGP), building upon two foundational models of data closeness and label coupling. This new form of smoothness axiomatizes a set of probability constraints, which ultimately enables class prediction. Theoretically, we provide an error and robustness analysis of PGP. Empirically, we conduct extensive experiments to show the advantages of PGP.}
}
APA
Fang, Y., Chang, K., & Lauw, H. (2014). Graph-based Semi-supervised Learning: Realizing Pointwise Smoothness Probabilistically. Proceedings of the 31st International Conference on Machine Learning, in Proceedings of Machine Learning Research 32(2):406-414. Available from https://proceedings.mlr.press/v32/fang14.html.

Related Material

Download PDF: http://proceedings.mlr.press/v32/fang14.pdf