Graph Sparsification Approaches for Laplacian Smoothing

Veeru Sadhanala, Yu-Xiang Wang, Ryan Tibshirani
Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, PMLR 51:1250-1259, 2016.

Abstract

Given a statistical estimation problem where regularization is performed according to the structure of a large, dense graph $G$, we consider fitting the statistical estimate using a {\it sparsified} surrogate graph $\mathbf{G}$, which shares the vertices of $G$ but has far fewer edges, and is thus more tractable to work with computationally. We examine three types of sparsification: spectral sparsification, which can be seen as the result of sampling edges from the graph with probabilities proportional to their effective resistances, and two simpler sparsifiers, which sample edges uniformly from the graph, either globally or locally. We provide strong theoretical and experimental results, demonstrating that sparsification before estimation can give statistically sensible solutions, with significant computational savings.
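The approaches described above can be sketched in a few lines of NumPy. This is an illustrative toy, not the paper's implementation: Laplacian smoothing is taken in its standard form $\hat\theta = (I + \lambda L)^{-1} y$, each sparsifier samples $s$ edges i.i.d. and reweights a kept edge by $w_e / (s \, p_e)$ so the sparsified Laplacian is unbiased for $L$, and the function and variable names are choices made here for the sketch.

```python
import numpy as np

def build_laplacian(n, edges, weights):
    """Combinatorial graph Laplacian L = D - W."""
    L = np.zeros((n, n))
    for (i, j), w in zip(edges, weights):
        L[i, i] += w; L[j, j] += w
        L[i, j] -= w; L[j, i] -= w
    return L

def laplacian_smooth(y, L, lam):
    """Minimize ||y - theta||^2 + lam * theta' L theta,
    whose closed form is theta = (I + lam L)^{-1} y."""
    return np.linalg.solve(np.eye(len(y)) + lam * L, y)

def sparsify(edges, weights, s, probs, rng):
    """Draw s edges i.i.d. from `probs`; reweight each kept edge
    by w_e / (s * p_e) so the sparsified Laplacian is unbiased."""
    idx = rng.choice(len(edges), size=s, p=probs)
    return [edges[k] for k in idx], [weights[k] / (s * probs[k]) for k in idx]

def effective_resistances(n, edges, weights):
    """R_e = (e_i - e_j)' L^+ (e_i - e_j) for each edge e = (i, j)."""
    Lp = np.linalg.pinv(build_laplacian(n, edges, weights))
    return np.array([Lp[i, i] + Lp[j, j] - 2 * Lp[i, j] for i, j in edges])

rng = np.random.default_rng(0)
n, lam = 40, 1.0
# A dense random graph and a noisy signal on its vertices.
edges = [(i, j) for i in range(n) for j in range(i + 1, n) if rng.random() < 0.5]
weights = np.ones(len(edges))
y = rng.normal(size=n)

theta_full = laplacian_smooth(y, build_laplacian(n, edges, weights), lam)

# Uniform sparsifier: every edge equally likely.
p_unif = np.full(len(edges), 1.0 / len(edges))
e_u, w_u = sparsify(edges, weights, s=200, probs=p_unif, rng=rng)
theta_unif = laplacian_smooth(y, build_laplacian(n, e_u, w_u), lam)

# Spectral sparsifier: probabilities proportional to w_e * R_e.
p_spec = weights * effective_resistances(n, edges, weights)
p_spec /= p_spec.sum()
e_s, w_s = sparsify(edges, weights, s=200, probs=p_spec, rng=rng)
theta_spec = laplacian_smooth(y, build_laplacian(n, e_s, w_s), lam)
```

On graphs this small the dense pseudoinverse is affordable; the point of the paper's setting is that for truly large $G$ one would sparsify first precisely to avoid dense linear algebra of this kind.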

Cite this Paper


BibTeX
@InProceedings{pmlr-v51-sadhanala16,
  title     = {Graph Sparsification Approaches for Laplacian Smoothing},
  author    = {Sadhanala, Veeru and Wang, Yu-Xiang and Tibshirani, Ryan},
  booktitle = {Proceedings of the 19th International Conference on Artificial Intelligence and Statistics},
  pages     = {1250--1259},
  year      = {2016},
  editor    = {Gretton, Arthur and Robert, Christian C.},
  volume    = {51},
  series    = {Proceedings of Machine Learning Research},
  address   = {Cadiz, Spain},
  month     = {09--11 May},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v51/sadhanala16.pdf},
  url       = {https://proceedings.mlr.press/v51/sadhanala16.html},
  abstract  = {Given a statistical estimation problem where regularization is performed according to the structure of a large, dense graph $G$, we consider fitting the statistical estimate using a {\it sparsified} surrogate graph $\mathbf{G}$, which shares the vertices of $G$ but has far fewer edges, and is thus more tractable to work with computationally. We examine three types of sparsification: spectral sparsification, which can be seen as the result of sampling edges from the graph with probabilities proportional to their effective resistances, and two simpler sparsifiers, which sample edges uniformly from the graph, either globally or locally. We provide strong theoretical and experimental results, demonstrating that sparsification before estimation can give statistically sensible solutions, with significant computational savings.}
}
Endnote
%0 Conference Paper
%T Graph Sparsification Approaches for Laplacian Smoothing
%A Veeru Sadhanala
%A Yu-Xiang Wang
%A Ryan Tibshirani
%B Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2016
%E Arthur Gretton
%E Christian C. Robert
%F pmlr-v51-sadhanala16
%I PMLR
%P 1250--1259
%U https://proceedings.mlr.press/v51/sadhanala16.html
%V 51
%X Given a statistical estimation problem where regularization is performed according to the structure of a large, dense graph G, we consider fitting the statistical estimate using a sparsified surrogate graph, which shares the vertices of G but has far fewer edges, and is thus more tractable to work with computationally. We examine three types of sparsification: spectral sparsification, which can be seen as the result of sampling edges from the graph with probabilities proportional to their effective resistances, and two simpler sparsifiers, which sample edges uniformly from the graph, either globally or locally. We provide strong theoretical and experimental results, demonstrating that sparsification before estimation can give statistically sensible solutions, with significant computational savings.
RIS
TY  - CPAPER
TI  - Graph Sparsification Approaches for Laplacian Smoothing
AU  - Veeru Sadhanala
AU  - Yu-Xiang Wang
AU  - Ryan Tibshirani
BT  - Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
DA  - 2016/05/02
ED  - Arthur Gretton
ED  - Christian C. Robert
ID  - pmlr-v51-sadhanala16
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 51
SP  - 1250
EP  - 1259
L1  - http://proceedings.mlr.press/v51/sadhanala16.pdf
UR  - https://proceedings.mlr.press/v51/sadhanala16.html
AB  - Given a statistical estimation problem where regularization is performed according to the structure of a large, dense graph G, we consider fitting the statistical estimate using a sparsified surrogate graph, which shares the vertices of G but has far fewer edges, and is thus more tractable to work with computationally. We examine three types of sparsification: spectral sparsification, which can be seen as the result of sampling edges from the graph with probabilities proportional to their effective resistances, and two simpler sparsifiers, which sample edges uniformly from the graph, either globally or locally. We provide strong theoretical and experimental results, demonstrating that sparsification before estimation can give statistically sensible solutions, with significant computational savings.
ER  -
APA
Sadhanala, V., Wang, Y. & Tibshirani, R. (2016). Graph Sparsification Approaches for Laplacian Smoothing. Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 51:1250-1259. Available from https://proceedings.mlr.press/v51/sadhanala16.html.