How to Learn a Graph from Smooth Signals

Vassilis Kalofolias
Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, PMLR 51:920-929, 2016.

Abstract

We propose a framework to learn the graph structure underlying a set of smooth signals. Given $X \in \mathbb{R}^{m \times n}$ whose rows reside on the vertices of an unknown graph, we learn the edge weights $w \in \mathbb{R}_+^{m(m-1)/2}$ under the smoothness assumption that $\mathrm{tr}(X^\top L X)$ is small, where $L$ is the graph Laplacian. We show that the problem is a weighted $\ell_1$ minimization that leads to naturally sparse solutions. We prove that the standard graph construction with Gaussian weights $w_{ij} = \exp(-\frac{1}{\sigma^2}\|x_i - x_j\|^2)$ and the previous state of the art are special cases of our framework. We propose a new model and present efficient, scalable primal-dual based algorithms both for this and the previous state of the art, to evaluate their performance on artificial and real data. The new model performs best in most settings.
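The two quantities the abstract builds on can be illustrated directly: the standard Gaussian-weight graph construction $w_{ij} = \exp(-\|x_i - x_j\|^2 / \sigma^2)$, and the smoothness measure $\mathrm{tr}(X^\top L X)$ with $L = D - W$. The sketch below is illustrative only (function names and the sample data are not from the paper), not the paper's learning algorithm itself:

```python
import numpy as np

def gaussian_weights(X, sigma=1.0):
    """Standard Gaussian edge weights between rows of X (one row per vertex):
    W[i, j] = exp(-||x_i - x_j||^2 / sigma^2), with no self-loops."""
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    W = np.exp(-sq_dists / sigma ** 2)
    np.fill_diagonal(W, 0.0)
    return W

def smoothness(X, W):
    """tr(X^T L X) with L = D - W, the combinatorial graph Laplacian.
    Small values mean the signal varies little across strong edges."""
    L = np.diag(W.sum(axis=1)) - W
    return np.trace(X.T @ L @ X)

# Toy data: two nearby vertices and one distant vertex.
X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0]])
W = gaussian_weights(X, sigma=1.0)
# The nearby pair gets a strong edge; the distant vertex is nearly disconnected.
```

A useful sanity check is the identity $\mathrm{tr}(X^\top L X) = \frac{1}{2}\sum_{i,j} W_{ij}\|x_i - x_j\|^2$, which is why small trace values correspond to signals that are smooth on the graph.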

Cite this Paper


BibTeX
@InProceedings{pmlr-v51-kalofolias16,
  title     = {How to Learn a Graph from Smooth Signals},
  author    = {Kalofolias, Vassilis},
  booktitle = {Proceedings of the 19th International Conference on Artificial Intelligence and Statistics},
  pages     = {920--929},
  year      = {2016},
  editor    = {Gretton, Arthur and Robert, Christian C.},
  volume    = {51},
  series    = {Proceedings of Machine Learning Research},
  address   = {Cadiz, Spain},
  month     = {09--11 May},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v51/kalofolias16.pdf},
  url       = {http://proceedings.mlr.press/v51/kalofolias16.html},
  abstract  = {We propose a framework to learn the graph structure underlying a set of smooth signals. Given $X \in \mathbb{R}^{m \times n}$ whose rows reside on the vertices of an unknown graph, we learn the edge weights $w \in \mathbb{R}_+^{m(m-1)/2}$ under the smoothness assumption that $\mathrm{tr}(X^\top L X)$ is small, where $L$ is the graph Laplacian. We show that the problem is a weighted $\ell_1$ minimization that leads to naturally sparse solutions. We prove that the standard graph construction with Gaussian weights $w_{ij} = \exp(-\frac{1}{\sigma^2}\|x_i - x_j\|^2)$ and the previous state of the art are special cases of our framework. We propose a new model and present efficient, scalable primal-dual based algorithms both for this and the previous state of the art, to evaluate their performance on artificial and real data. The new model performs best in most settings.}
}
Endnote
%0 Conference Paper
%T How to Learn a Graph from Smooth Signals
%A Vassilis Kalofolias
%B Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2016
%E Arthur Gretton
%E Christian C. Robert
%F pmlr-v51-kalofolias16
%I PMLR
%P 920--929
%U http://proceedings.mlr.press/v51/kalofolias16.html
%V 51
%X We propose a framework to learn the graph structure underlying a set of smooth signals. Given $X \in \mathbb{R}^{m \times n}$ whose rows reside on the vertices of an unknown graph, we learn the edge weights $w \in \mathbb{R}_+^{m(m-1)/2}$ under the smoothness assumption that $\mathrm{tr}(X^\top L X)$ is small, where $L$ is the graph Laplacian. We show that the problem is a weighted $\ell_1$ minimization that leads to naturally sparse solutions. We prove that the standard graph construction with Gaussian weights $w_{ij} = \exp(-\frac{1}{\sigma^2}\|x_i - x_j\|^2)$ and the previous state of the art are special cases of our framework. We propose a new model and present efficient, scalable primal-dual based algorithms both for this and the previous state of the art, to evaluate their performance on artificial and real data. The new model performs best in most settings.
RIS
TY - CPAPER
TI - How to Learn a Graph from Smooth Signals
AU - Vassilis Kalofolias
BT - Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
DA - 2016/05/02
ED - Arthur Gretton
ED - Christian C. Robert
ID - pmlr-v51-kalofolias16
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 51
SP - 920
EP - 929
L1 - http://proceedings.mlr.press/v51/kalofolias16.pdf
UR - http://proceedings.mlr.press/v51/kalofolias16.html
AB - We propose a framework to learn the graph structure underlying a set of smooth signals. Given $X \in \mathbb{R}^{m \times n}$ whose rows reside on the vertices of an unknown graph, we learn the edge weights $w \in \mathbb{R}_+^{m(m-1)/2}$ under the smoothness assumption that $\mathrm{tr}(X^\top L X)$ is small, where $L$ is the graph Laplacian. We show that the problem is a weighted $\ell_1$ minimization that leads to naturally sparse solutions. We prove that the standard graph construction with Gaussian weights $w_{ij} = \exp(-\frac{1}{\sigma^2}\|x_i - x_j\|^2)$ and the previous state of the art are special cases of our framework. We propose a new model and present efficient, scalable primal-dual based algorithms both for this and the previous state of the art, to evaluate their performance on artificial and real data. The new model performs best in most settings.
ER -
APA
Kalofolias, V. (2016). How to Learn a Graph from Smooth Signals. Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 51:920-929. Available from http://proceedings.mlr.press/v51/kalofolias16.html.