Stochastic Iterative Hard Thresholding for Graph-structured Sparsity Optimization
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:7563-7573, 2019.
Abstract
Stochastic optimization algorithms update models sequentially with cheap per-iteration costs, which makes them amenable to large-scale data analysis. Such algorithms have been widely studied for structured sparse models where the sparsity information is very specific, e.g., convex sparsity-inducing norms or the $\ell^0$-norm. However, these norms cannot be directly applied to the problem of complex (non-convex) graph-structured sparsity models, which have important applications in disease outbreak detection and social networks, among others. In this paper, we propose a stochastic gradient-based method for solving graph-structured sparsity constraint problems that is not restricted to the least-squares loss. We prove that our algorithm enjoys linear convergence up to a constant error, which is competitive with its counterparts in the batch learning setting. We conduct extensive experiments to show the efficiency and effectiveness of the proposed algorithms.
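To make the idea behind stochastic iterative hard thresholding concrete, here is a minimal sketch of the generic template the abstract describes: a mini-batch gradient step followed by a hard-thresholding projection. For readability it uses a least-squares loss and a plain top-$k$ ($\ell^0$) projection as a stand-in for the paper's graph-structured projection; the function names, step size, and batch size below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def hard_threshold(x, k):
    """Project x onto the set of k-sparse vectors: keep the k
    largest-magnitude entries and zero out the rest.
    (Stand-in for a graph-structured sparsity projection.)"""
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]
    out[idx] = x[idx]
    return out

def sto_iht(A, y, k, step=0.5, batch=50, iters=300, seed=0):
    """Stochastic IHT sketch for min ||Ax - y||^2 s.t. x is k-sparse.

    Each iteration samples a mini-batch of rows, takes a gradient
    step on the mini-batch least-squares loss (cheap per-iteration
    cost), then projects back onto the sparsity constraint.
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(iters):
        i = rng.choice(n, size=batch, replace=False)   # mini-batch
        grad = A[i].T @ (A[i] @ x - y[i]) / batch      # stochastic gradient
        x = hard_threshold(x - step * grad, k)         # projection step
    return x
```

Replacing `hard_threshold` with an (approximate) projection onto a family of allowed supports, such as connected subgraphs, recovers the graph-structured setting; the stochastic gradient step itself is unchanged, which is why the method is not tied to the least-squares loss.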