Stochastic Iterative Hard Thresholding for Graph-structured Sparsity Optimization

Baojian Zhou, Feng Chen, Yiming Ying
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:7563-7573, 2019.

Abstract

Stochastic optimization algorithms update models sequentially with cheap per-iteration costs, which makes them amenable to large-scale data analysis. Such algorithms have been widely studied for structured sparse models where the sparsity information is encoded very specifically, e.g., by convex sparsity-inducing norms or the $\ell^0$-norm. However, these norms cannot be directly applied to complex (non-convex) graph-structured sparsity models, which have important applications in areas such as disease outbreak detection and social networks. In this paper, we propose a stochastic gradient-based method for solving graph-structured sparsity-constrained problems that is not restricted to the least-squares loss. We prove that our algorithm enjoys linear convergence up to a constant error, which is competitive with its counterparts in the batch learning setting. We conduct extensive experiments to demonstrate the efficiency and effectiveness of the proposed algorithms.
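For intuition, here is a minimal sketch of the stochastic iterative hard thresholding template the abstract describes: a mini-batch gradient step followed by a projection onto the sparsity model. The function names, the least-squares loss, and the plain $k$-sparse hard-thresholding projection are illustrative assumptions; the paper's method replaces that exact projection with (approximate) projections tailored to graph-structured sparsity models and is not restricted to least squares.

import numpy as np

def hard_threshold(w, k):
    """Keep the k largest-magnitude entries of w; zero the rest.

    NOTE: a plain l0 projection used here as a stand-in for the
    paper's projection onto a graph-structured sparsity model.
    """
    out = np.zeros_like(w)
    idx = np.argpartition(np.abs(w), -k)[-k:]
    out[idx] = w[idx]
    return out

def stochastic_iht(X, y, k, lr=0.01, epochs=50, batch=32, seed=0):
    """Minimal stochastic IHT sketch, shown for a least-squares loss."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        for _ in range(n // batch):
            i = rng.choice(n, size=batch, replace=False)
            grad = X[i].T @ (X[i] @ w - y[i]) / batch   # mini-batch gradient
            w = hard_threshold(w - lr * grad, k)        # gradient step + projection
    return w

The key design point is that only the projection step needs to change to handle graph structure; the stochastic gradient step is oblivious to the sparsity model.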

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-zhou19a,
  title     = {Stochastic Iterative Hard Thresholding for Graph-structured Sparsity Optimization},
  author    = {Zhou, Baojian and Chen, Feng and Ying, Yiming},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {7563--7573},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/zhou19a/zhou19a.pdf},
  url       = {https://proceedings.mlr.press/v97/zhou19a.html},
  abstract  = {Stochastic optimization algorithms update models sequentially with cheap per-iteration costs, which makes them amenable to large-scale data analysis. Such algorithms have been widely studied for structured sparse models where the sparsity information is encoded very specifically, e.g., by convex sparsity-inducing norms or the $\ell^0$-norm. However, these norms cannot be directly applied to complex (non-convex) graph-structured sparsity models, which have important applications in areas such as disease outbreak detection and social networks. In this paper, we propose a stochastic gradient-based method for solving graph-structured sparsity-constrained problems that is not restricted to the least-squares loss. We prove that our algorithm enjoys linear convergence up to a constant error, which is competitive with its counterparts in the batch learning setting. We conduct extensive experiments to demonstrate the efficiency and effectiveness of the proposed algorithms.}
}
Endnote
%0 Conference Paper
%T Stochastic Iterative Hard Thresholding for Graph-structured Sparsity Optimization
%A Baojian Zhou
%A Feng Chen
%A Yiming Ying
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-zhou19a
%I PMLR
%P 7563--7573
%U https://proceedings.mlr.press/v97/zhou19a.html
%V 97
%X Stochastic optimization algorithms update models sequentially with cheap per-iteration costs, which makes them amenable to large-scale data analysis. Such algorithms have been widely studied for structured sparse models where the sparsity information is encoded very specifically, e.g., by convex sparsity-inducing norms or the $\ell^0$-norm. However, these norms cannot be directly applied to complex (non-convex) graph-structured sparsity models, which have important applications in areas such as disease outbreak detection and social networks. In this paper, we propose a stochastic gradient-based method for solving graph-structured sparsity-constrained problems that is not restricted to the least-squares loss. We prove that our algorithm enjoys linear convergence up to a constant error, which is competitive with its counterparts in the batch learning setting. We conduct extensive experiments to demonstrate the efficiency and effectiveness of the proposed algorithms.
APA
Zhou, B., Chen, F. & Ying, Y. (2019). Stochastic Iterative Hard Thresholding for Graph-structured Sparsity Optimization. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:7563-7573. Available from https://proceedings.mlr.press/v97/zhou19a.html.