Revisiting Semi-Supervised Learning with Graph Embeddings

Zhilin Yang, William Cohen, Ruslan Salakhudinov
Proceedings of The 33rd International Conference on Machine Learning, PMLR 48:40-48, 2016.

Abstract

We present a semi-supervised learning framework based on graph embeddings. Given a graph between instances, we train an embedding for each instance to jointly predict the class label and the neighborhood context in the graph. We develop both transductive and inductive variants of our method. In the transductive variant of our method, the class labels are determined by both the learned embeddings and input feature vectors, while in the inductive variant, the embeddings are defined as a parametric function of the feature vectors, so predictions can be made on instances not seen during training. On a large and diverse set of benchmark tasks, including text classification, distantly supervised entity extraction, and entity classification, we show improved performance over many of the existing models.
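Below is a minimal sketch of the transductive joint objective the abstract describes: each instance keeps a learned embedding that is trained both to predict a sampled graph neighbor (context loss) and, together with the input features, to predict the class label. This is an illustrative PyTorch-style reconstruction under assumed names (TransductiveGraphEmbedding, context_loss, label_loss), not the authors' released code; the paper uses negative sampling for the context term, which is replaced here by a full softmax for brevity. In the inductive variant, the per-node embedding table would be replaced by a parametric network applied to the feature vector, so unseen instances can be embedded.

import torch
import torch.nn as nn
import torch.nn.functional as F

class TransductiveGraphEmbedding(nn.Module):
    def __init__(self, num_nodes, feat_dim, embed_dim, num_classes):
        super().__init__()
        self.embed = nn.Embedding(num_nodes, embed_dim)       # one learned embedding per instance
        self.context_out = nn.Linear(embed_dim, num_nodes)    # scores over candidate context nodes
        self.classifier = nn.Linear(feat_dim + embed_dim, num_classes)

    def context_loss(self, nodes, context_nodes):
        # Predict a neighbor sampled from the graph from the node's embedding
        # (softmax over all nodes; the paper uses negative sampling instead).
        logits = self.context_out(self.embed(nodes))
        return F.cross_entropy(logits, context_nodes)

    def label_loss(self, nodes, features, labels):
        # The class label is predicted from the input features and the embedding jointly.
        h = torch.cat([features, self.embed(nodes)], dim=-1)
        return F.cross_entropy(self.classifier(h), labels)

# Toy usage: 5 nodes, 3-dim features, 2 classes.
model = TransductiveGraphEmbedding(num_nodes=5, feat_dim=3, embed_dim=4, num_classes=2)
nodes = torch.tensor([0, 1])
feats = torch.randn(2, 3)
labels = torch.tensor([0, 1])
contexts = torch.tensor([2, 3])                               # sampled graph neighbors
loss = model.label_loss(nodes, feats, labels) + model.context_loss(nodes, contexts)
loss.backward()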

Cite this Paper


BibTeX
@InProceedings{pmlr-v48-yanga16,
  title = {Revisiting Semi-Supervised Learning with Graph Embeddings},
  author = {Yang, Zhilin and Cohen, William and Salakhudinov, Ruslan},
  booktitle = {Proceedings of The 33rd International Conference on Machine Learning},
  pages = {40--48},
  year = {2016},
  editor = {Balcan, Maria Florina and Weinberger, Kilian Q.},
  volume = {48},
  series = {Proceedings of Machine Learning Research},
  address = {New York, New York, USA},
  month = {20--22 Jun},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v48/yanga16.pdf},
  url = {https://proceedings.mlr.press/v48/yanga16.html},
  abstract = {We present a semi-supervised learning framework based on graph embeddings. Given a graph between instances, we train an embedding for each instance to jointly predict the class label and the neighborhood context in the graph. We develop both transductive and inductive variants of our method. In the transductive variant of our method, the class labels are determined by both the learned embeddings and input feature vectors, while in the inductive variant, the embeddings are defined as a parametric function of the feature vectors, so predictions can be made on instances not seen during training. On a large and diverse set of benchmark tasks, including text classification, distantly supervised entity extraction, and entity classification, we show improved performance over many of the existing models.}
}
Endnote
%0 Conference Paper
%T Revisiting Semi-Supervised Learning with Graph Embeddings
%A Zhilin Yang
%A William Cohen
%A Ruslan Salakhudinov
%B Proceedings of The 33rd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2016
%E Maria Florina Balcan
%E Kilian Q. Weinberger
%F pmlr-v48-yanga16
%I PMLR
%P 40--48
%U https://proceedings.mlr.press/v48/yanga16.html
%V 48
%X We present a semi-supervised learning framework based on graph embeddings. Given a graph between instances, we train an embedding for each instance to jointly predict the class label and the neighborhood context in the graph. We develop both transductive and inductive variants of our method. In the transductive variant of our method, the class labels are determined by both the learned embeddings and input feature vectors, while in the inductive variant, the embeddings are defined as a parametric function of the feature vectors, so predictions can be made on instances not seen during training. On a large and diverse set of benchmark tasks, including text classification, distantly supervised entity extraction, and entity classification, we show improved performance over many of the existing models.
RIS
TY - CPAPER
TI - Revisiting Semi-Supervised Learning with Graph Embeddings
AU - Zhilin Yang
AU - William Cohen
AU - Ruslan Salakhudinov
BT - Proceedings of The 33rd International Conference on Machine Learning
DA - 2016/06/11
ED - Maria Florina Balcan
ED - Kilian Q. Weinberger
ID - pmlr-v48-yanga16
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 48
SP - 40
EP - 48
L1 - http://proceedings.mlr.press/v48/yanga16.pdf
UR - https://proceedings.mlr.press/v48/yanga16.html
AB - We present a semi-supervised learning framework based on graph embeddings. Given a graph between instances, we train an embedding for each instance to jointly predict the class label and the neighborhood context in the graph. We develop both transductive and inductive variants of our method. In the transductive variant of our method, the class labels are determined by both the learned embeddings and input feature vectors, while in the inductive variant, the embeddings are defined as a parametric function of the feature vectors, so predictions can be made on instances not seen during training. On a large and diverse set of benchmark tasks, including text classification, distantly supervised entity extraction, and entity classification, we show improved performance over many of the existing models.
ER -
APA
Yang, Z., Cohen, W. & Salakhudinov, R. (2016). Revisiting Semi-Supervised Learning with Graph Embeddings. Proceedings of The 33rd International Conference on Machine Learning, in Proceedings of Machine Learning Research 48:40-48. Available from https://proceedings.mlr.press/v48/yanga16.html.
