A Nearly-Linear Time Framework for Graph-Structured Sparsity

Chinmay Hegde, Piotr Indyk, Ludwig Schmidt
Proceedings of the 32nd International Conference on Machine Learning, PMLR 37:928-937, 2015.

Abstract

We introduce a framework for sparsity structures defined via graphs. Our approach is flexible and generalizes several previously studied sparsity models. Moreover, we provide efficient projection algorithms for our sparsity model that run in nearly-linear time. In the context of sparse recovery, we show that our framework achieves an information-theoretically optimal sample complexity for a wide range of parameters. We complement our theoretical analysis with experiments demonstrating that our algorithms improve on prior work also in practice.

Cite this Paper


BibTeX
@InProceedings{pmlr-v37-hegde15,
  title     = {A Nearly-Linear Time Framework for Graph-Structured Sparsity},
  author    = {Hegde, Chinmay and Indyk, Piotr and Schmidt, Ludwig},
  booktitle = {Proceedings of the 32nd International Conference on Machine Learning},
  pages     = {928--937},
  year      = {2015},
  editor    = {Bach, Francis and Blei, David},
  volume    = {37},
  series    = {Proceedings of Machine Learning Research},
  address   = {Lille, France},
  month     = {07--09 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v37/hegde15.pdf},
  url       = {http://proceedings.mlr.press/v37/hegde15.html},
  abstract  = {We introduce a framework for sparsity structures defined via graphs. Our approach is flexible and generalizes several previously studied sparsity models. Moreover, we provide efficient projection algorithms for our sparsity model that run in nearly-linear time. In the context of sparse recovery, we show that our framework achieves an information-theoretically optimal sample complexity for a wide range of parameters. We complement our theoretical analysis with experiments demonstrating that our algorithms improve on prior work also in practice.}
}
Endnote
%0 Conference Paper
%T A Nearly-Linear Time Framework for Graph-Structured Sparsity
%A Chinmay Hegde
%A Piotr Indyk
%A Ludwig Schmidt
%B Proceedings of the 32nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2015
%E Francis Bach
%E David Blei
%F pmlr-v37-hegde15
%I PMLR
%P 928--937
%U http://proceedings.mlr.press/v37/hegde15.html
%V 37
%X We introduce a framework for sparsity structures defined via graphs. Our approach is flexible and generalizes several previously studied sparsity models. Moreover, we provide efficient projection algorithms for our sparsity model that run in nearly-linear time. In the context of sparse recovery, we show that our framework achieves an information-theoretically optimal sample complexity for a wide range of parameters. We complement our theoretical analysis with experiments demonstrating that our algorithms improve on prior work also in practice.
RIS
TY  - CPAPER
TI  - A Nearly-Linear Time Framework for Graph-Structured Sparsity
AU  - Chinmay Hegde
AU  - Piotr Indyk
AU  - Ludwig Schmidt
BT  - Proceedings of the 32nd International Conference on Machine Learning
DA  - 2015/06/01
ED  - Francis Bach
ED  - David Blei
ID  - pmlr-v37-hegde15
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 37
SP  - 928
EP  - 937
L1  - http://proceedings.mlr.press/v37/hegde15.pdf
UR  - http://proceedings.mlr.press/v37/hegde15.html
AB  - We introduce a framework for sparsity structures defined via graphs. Our approach is flexible and generalizes several previously studied sparsity models. Moreover, we provide efficient projection algorithms for our sparsity model that run in nearly-linear time. In the context of sparse recovery, we show that our framework achieves an information-theoretically optimal sample complexity for a wide range of parameters. We complement our theoretical analysis with experiments demonstrating that our algorithms improve on prior work also in practice.
ER  -
APA
Hegde, C., Indyk, P. & Schmidt, L. (2015). A Nearly-Linear Time Framework for Graph-Structured Sparsity. Proceedings of the 32nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 37:928-937. Available from http://proceedings.mlr.press/v37/hegde15.html.