Re-revisiting Learning on Hypergraphs: Confidence Interval and Subgradient Method

Chenzi Zhang, Shuguang Hu, Zhihao Gavin Tang, T-H. Hubert Chan
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:4026-4034, 2017.

Abstract

We revisit semi-supervised learning on hypergraphs. As in previous approaches, our method uses a convex program whose objective function is not everywhere differentiable. We exploit the non-uniqueness of the optimal solutions and consider confidence intervals, which give the exact range of values that each unlabeled vertex takes over all optimal solutions. Moreover, we give a much simpler approach for solving the convex program, based on the subgradient method. Our experiments on real-world datasets confirm that our confidence interval approach on hypergraphs outperforms existing methods, and our subgradient method gives faster running times when the number of vertices is much larger than the number of edges.
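
For intuition, convex programs in this line of work penalize, for each hyperedge, the spread between its largest and smallest vertex values; the max/min structure keeps the objective convex but makes it non-differentiable wherever the extremum is attained by several vertices, which is why a subgradient (rather than gradient) method applies. The sketch below is a minimal illustration of a projected subgradient method on a regularizer of this form; the objective, step-size schedule, data encoding, and function name are assumptions for illustration and are not taken from the paper.

```python
import numpy as np

def hypergraph_ssl_subgradient(hyperedges, weights, labels, n,
                               iters=2000, step=1.0):
    """Projected subgradient descent for hypergraph semi-supervised learning.

    Minimizes  sum_e w_e * (max_{u in e} f_u - min_{v in e} f_v)^2
    over f in R^n, clamping labeled vertices to their given values.
    All names and parameters here are illustrative, not from the paper.
    """
    f = np.zeros(n)
    for v, y in labels.items():                # initialize labeled vertices
        f[v] = y
    best_f, best_val = f.copy(), np.inf
    for t in range(1, iters + 1):
        g, val = np.zeros(n), 0.0
        for e, w in zip(hyperedges, weights):
            u_star = e[int(np.argmax(f[e]))]   # vertex attaining the max in e
            v_star = e[int(np.argmin(f[e]))]   # vertex attaining the min in e
            gap = f[u_star] - f[v_star]
            val += w * gap * gap
            # A subgradient of w*(max - min)^2 acts only on the extremal
            # vertices; at ties, any maximizer/minimizer is a valid choice.
            g[u_star] += 2.0 * w * gap
            g[v_star] -= 2.0 * w * gap
        if val < best_val:                     # track the best iterate seen
            best_val, best_f = val, f.copy()
        f -= (step / np.sqrt(t)) * g           # diminishing step size
        for v, y in labels.items():            # project onto {f_v = y_v}
            f[v] = y
    return best_f

# Toy usage: 6 vertices, two overlapping hyperedges, two labeled vertices.
f = hypergraph_ssl_subgradient([[0, 1, 2], [2, 3, 4, 5]], [1.0, 1.0],
                               {0: 1.0, 5: -1.0}, n=6)
```

Because the objective is non-differentiable at ties, convergence guarantees for subgradient methods hold for the best iterate under a diminishing step size, hence the `best_f` bookkeeping. The non-uniqueness of optimizers at such points is exactly what the paper's confidence intervals capture: for each unlabeled vertex, the range its value spans across all optimal solutions.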

Cite this Paper


BibTeX
@InProceedings{pmlr-v70-zhang17d,
  title     = {Re-revisiting Learning on Hypergraphs: Confidence Interval and Subgradient Method},
  author    = {Chenzi Zhang and Shuguang Hu and Zhihao Gavin Tang and T-H. Hubert Chan},
  booktitle = {Proceedings of the 34th International Conference on Machine Learning},
  pages     = {4026--4034},
  year      = {2017},
  editor    = {Precup, Doina and Teh, Yee Whye},
  volume    = {70},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--11 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v70/zhang17d/zhang17d.pdf},
  url       = {https://proceedings.mlr.press/v70/zhang17d.html},
  abstract  = {We revisit semi-supervised learning on hypergraphs. Same as previous approaches, our method uses a convex program whose objective function is not everywhere differentiable. We exploit the non-uniqueness of the optimal solutions, and consider confidence intervals which give the exact ranges that unlabeled vertices take in any optimal solution. Moreover, we give a much simpler approach for solving the convex program based on the subgradient method. Our experiments on real-world datasets confirm that our confidence interval approach on hypergraphs outperforms existing methods, and our sub-gradient method gives faster running times when the number of vertices is much larger than the number of edges.}
}
Endnote
%0 Conference Paper
%T Re-revisiting Learning on Hypergraphs: Confidence Interval and Subgradient Method
%A Chenzi Zhang
%A Shuguang Hu
%A Zhihao Gavin Tang
%A T-H. Hubert Chan
%B Proceedings of the 34th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2017
%E Doina Precup
%E Yee Whye Teh
%F pmlr-v70-zhang17d
%I PMLR
%P 4026--4034
%U https://proceedings.mlr.press/v70/zhang17d.html
%V 70
%X We revisit semi-supervised learning on hypergraphs. Same as previous approaches, our method uses a convex program whose objective function is not everywhere differentiable. We exploit the non-uniqueness of the optimal solutions, and consider confidence intervals which give the exact ranges that unlabeled vertices take in any optimal solution. Moreover, we give a much simpler approach for solving the convex program based on the subgradient method. Our experiments on real-world datasets confirm that our confidence interval approach on hypergraphs outperforms existing methods, and our sub-gradient method gives faster running times when the number of vertices is much larger than the number of edges.
APA
Zhang, C., Hu, S., Tang, Z.G. & Chan, T.-H.H. (2017). Re-revisiting Learning on Hypergraphs: Confidence Interval and Subgradient Method. Proceedings of the 34th International Conference on Machine Learning, in Proceedings of Machine Learning Research 70:4026-4034. Available from https://proceedings.mlr.press/v70/zhang17d.html.