Semi-Supervised Learning with Competitive Infection Models

Nir Rosenfeld, Amir Globerson
Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, PMLR 84:336-346, 2018.

Abstract

The goal in semi-supervised learning is to effectively combine labeled and unlabeled data. One way to do this is by encouraging smoothness across edges in a graph whose nodes correspond to input examples. In many graph-based methods, labels can be thought of as propagating over the graph, where the underlying propagation mechanism is based on random walks or on averaging dynamics. While theoretically elegant, these dynamics suffer from several drawbacks which can hurt predictive performance. Our goal in this work is to explore alternative mechanisms for propagating labels. In particular, we propose a method based on dynamic infection processes, where unlabeled nodes can be "infected" with the label of their already infected neighbors. Our algorithm is efficient and scalable, and an analysis of the underlying optimization objective reveals a surprising relation to other Laplacian approaches. We conclude with a thorough set of experiments across multiple benchmarks and various learning settings.
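As an illustration of the propagation mechanism the abstract describes, below is a minimal sketch of a competitive infection process for graph-based label propagation. This is not the authors' algorithm: the exponential transmission delays, first-arrival tie-breaking, and majority vote over runs are illustrative assumptions, and all function names are hypothetical.

import heapq
import random
from collections import Counter, defaultdict

def simulate_infection(neighbors, seed_labels, rate=1.0, rng=random):
    # One stochastic run: labeled seed nodes start infected at time 0; an
    # infection crosses each edge after an exponential delay, and the first
    # label to reach a node claims it (the "competitive" part). Assumed setup,
    # not the paper's exact process.
    label_of = {}
    heap = [(0.0, node, lab) for node, lab in seed_labels.items()]
    heapq.heapify(heap)
    while heap:
        t, u, lab = heapq.heappop(heap)
        if u in label_of:
            continue  # already claimed by an earlier-arriving infection
        label_of[u] = lab
        for v in neighbors[u]:
            if v not in label_of:
                heapq.heappush(heap, (t + rng.expovariate(rate), v, lab))
    return label_of

def predict(neighbors, seed_labels, n_runs=100, seed=0):
    # Average many stochastic runs and predict each node by its majority label.
    rng = random.Random(seed)
    votes = defaultdict(Counter)
    for _ in range(n_runs):
        for node, lab in simulate_infection(neighbors, seed_labels, rng=rng).items():
            votes[node][lab] += 1
    return {node: c.most_common(1)[0][0] for node, c in votes.items()}

if __name__ == "__main__":
    # Toy chain graph: the two labeled endpoints compete for the middle nodes.
    neighbors = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
    print(predict(neighbors, seed_labels={0: "A", 4: "B"}))

In this toy example the two labeled endpoints race to infect the unlabeled middle nodes, and averaging many runs smooths out the randomness of individual cascades.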

Cite this Paper


BibTeX
@InProceedings{pmlr-v84-rosenfeld18a,
  title     = {Semi-Supervised Learning with Competitive Infection Models},
  author    = {Rosenfeld, Nir and Globerson, Amir},
  booktitle = {Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics},
  pages     = {336--346},
  year      = {2018},
  editor    = {Storkey, Amos and Perez-Cruz, Fernando},
  volume    = {84},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--11 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v84/rosenfeld18a/rosenfeld18a.pdf},
  url       = {https://proceedings.mlr.press/v84/rosenfeld18a.html},
  abstract  = {The goal in semi-supervised learning is to effectively combine labeled and unlabeled data. One way to do this is by encouraging smoothness across edges in a graph whose nodes correspond to input examples. In many graph-based methods, labels can be thought of as propagating over the graph, where the underlying propagation mechanism is based on random walks or on averaging dynamics. While theoretically elegant, these dynamics suffer from several drawbacks which can hurt predictive performance. Our goal in this work is to explore alternative mechanisms for propagating labels. In particular, we propose a method based on dynamic infection processes, where unlabeled nodes can be "infected" with the label of their already infected neighbors. Our algorithm is efficient and scalable, and an analysis of the underlying optimization objective reveals a surprising relation to other Laplacian approaches. We conclude with a thorough set of experiments across multiple benchmarks and various learning settings.}
}
Endnote
%0 Conference Paper
%T Semi-Supervised Learning with Competitive Infection Models
%A Nir Rosenfeld
%A Amir Globerson
%B Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2018
%E Amos Storkey
%E Fernando Perez-Cruz
%F pmlr-v84-rosenfeld18a
%I PMLR
%P 336--346
%U https://proceedings.mlr.press/v84/rosenfeld18a.html
%V 84
%X The goal in semi-supervised learning is to effectively combine labeled and unlabeled data. One way to do this is by encouraging smoothness across edges in a graph whose nodes correspond to input examples. In many graph-based methods, labels can be thought of as propagating over the graph, where the underlying propagation mechanism is based on random walks or on averaging dynamics. While theoretically elegant, these dynamics suffer from several drawbacks which can hurt predictive performance. Our goal in this work is to explore alternative mechanisms for propagating labels. In particular, we propose a method based on dynamic infection processes, where unlabeled nodes can be "infected" with the label of their already infected neighbors. Our algorithm is efficient and scalable, and an analysis of the underlying optimization objective reveals a surprising relation to other Laplacian approaches. We conclude with a thorough set of experiments across multiple benchmarks and various learning settings.
APA
Rosenfeld, N. & Globerson, A. (2018). Semi-Supervised Learning with Competitive Infection Models. Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 84:336-346. Available from https://proceedings.mlr.press/v84/rosenfeld18a.html.
