Algorithms for Lipschitz Learning on Graphs

Rasmus Kyng, Anup Rao, Sushant Sachdeva, Daniel A. Spielman
Proceedings of The 28th Conference on Learning Theory, PMLR 40:1190-1223, 2015.

Abstract

We develop fast algorithms for solving regression problems on graphs where one is given the value of a function at some vertices, and must find its smoothest possible extension to all vertices. The extension we compute is the absolutely minimal Lipschitz extension, and is the limit for large p of p-Laplacian regularization. We present an algorithm that computes a minimal Lipschitz extension in expected linear time, and an algorithm that computes an absolutely minimal Lipschitz extension in expected time Õ(mn). The latter algorithm has variants that seem to run much faster in practice. These extensions are particularly amenable to regularization: we can perform ℓ_0-regularization on the given values in polynomial time and ℓ_1-regularization on the initial function values and on graph edge weights in time Õ(m^(3/2)). Our definitions and algorithms naturally extend to directed graphs.
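To illustrate the object the abstract describes, the sketch below runs the classic infinity-harmonic fixed-point iteration toward the absolutely minimal Lipschitz extension (AMLE) on an unweighted graph: each free vertex is repeatedly replaced by the midrange of its neighbors' values. This is only an illustrative relaxation, not the paper's fast algorithms; the function and variable names (`amle_relaxation`, `adj`, `fixed`) are ours.

```python
# Sketch: fixed-point relaxation toward the absolutely minimal Lipschitz
# extension (AMLE) on an unweighted graph. Illustrative only -- this is
# the standard infinity-harmonic iteration, not the paper's fast algorithm.

def amle_relaxation(adj, fixed, n_iter=2000):
    """adj: list of neighbor lists; fixed: dict mapping vertex -> given value."""
    n = len(adj)
    # Free vertices start at 0.0; the iteration converges from any start.
    f = [fixed.get(v, 0.0) for v in range(n)]
    for _ in range(n_iter):
        for v in range(n):
            if v in fixed:
                continue
            nbr = [f[u] for u in adj[v]]
            # Infinity-harmonic update: midrange of neighboring values.
            f[v] = 0.5 * (max(nbr) + min(nbr))
    return f

# Path graph 0-1-2-3-4 with endpoints pinned at 0 and 1: the AMLE is the
# linear interpolation [0, 0.25, 0.5, 0.75, 1].
adj = [[1], [0, 2], [1, 3], [2, 4], [3]]
print(amle_relaxation(adj, {0: 0.0, 4: 1.0}))
```

On a path the AMLE is simply linear interpolation between the pinned endpoints, which makes the example easy to check by hand; on general graphs the same update converges to the unique infinity-harmonic extension of the given values.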

Cite this Paper


BibTeX
@InProceedings{pmlr-v40-Kyng15,
  title     = {Algorithms for Lipschitz Learning on Graphs},
  author    = {Rasmus Kyng and Anup Rao and Sushant Sachdeva and Daniel A. Spielman},
  booktitle = {Proceedings of The 28th Conference on Learning Theory},
  pages     = {1190--1223},
  year      = {2015},
  editor    = {Peter Grünwald and Elad Hazan and Satyen Kale},
  volume    = {40},
  series    = {Proceedings of Machine Learning Research},
  address   = {Paris, France},
  month     = {03--06 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v40/Kyng15.pdf},
  url       = {http://proceedings.mlr.press/v40/Kyng15.html},
  abstract  = {We develop fast algorithms for solving regression problems on graphs where one is given the value of a function at some vertices, and must find its smoothest possible extension to all vertices. The extension we compute is the absolutely minimal Lipschitz extension, and is the limit for large $p$ of $p$-Laplacian regularization. We present an algorithm that computes a minimal Lipschitz extension in expected linear time, and an algorithm that computes an absolutely minimal Lipschitz extension in expected time $\widetilde{O}(mn)$. The latter algorithm has variants that seem to run much faster in practice. These extensions are particularly amenable to regularization: we can perform $\ell_0$-regularization on the given values in polynomial time and $\ell_1$-regularization on the initial function values and on graph edge weights in time $\widetilde{O}(m^{3/2})$. Our definitions and algorithms naturally extend to directed graphs.}
}
Endnote
%0 Conference Paper
%T Algorithms for Lipschitz Learning on Graphs
%A Rasmus Kyng
%A Anup Rao
%A Sushant Sachdeva
%A Daniel A. Spielman
%B Proceedings of The 28th Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2015
%E Peter Grünwald
%E Elad Hazan
%E Satyen Kale
%F pmlr-v40-Kyng15
%I PMLR
%J Proceedings of Machine Learning Research
%P 1190--1223
%U http://proceedings.mlr.press
%V 40
%W PMLR
%X We develop fast algorithms for solving regression problems on graphs where one is given the value of a function at some vertices, and must find its smoothest possible extension to all vertices. The extension we compute is the absolutely minimal Lipschitz extension, and is the limit for large p of p-Laplacian regularization. We present an algorithm that computes a minimal Lipschitz extension in expected linear time, and an algorithm that computes an absolutely minimal Lipschitz extension in expected time Õ(mn). The latter algorithm has variants that seem to run much faster in practice. These extensions are particularly amenable to regularization: we can perform ℓ_0-regularization on the given values in polynomial time and ℓ_1-regularization on the initial function values and on graph edge weights in time Õ(m^(3/2)). Our definitions and algorithms naturally extend to directed graphs.
RIS
TY  - CPAPER
TI  - Algorithms for Lipschitz Learning on Graphs
AU  - Rasmus Kyng
AU  - Anup Rao
AU  - Sushant Sachdeva
AU  - Daniel A. Spielman
BT  - Proceedings of The 28th Conference on Learning Theory
PY  - 2015/06/26
DA  - 2015/06/26
ED  - Peter Grünwald
ED  - Elad Hazan
ED  - Satyen Kale
ID  - pmlr-v40-Kyng15
PB  - PMLR
DP  - PMLR
SP  - 1190
EP  - 1223
L1  - http://proceedings.mlr.press/v40/Kyng15.pdf
UR  - http://proceedings.mlr.press/v40/Kyng15.html
AB  - We develop fast algorithms for solving regression problems on graphs where one is given the value of a function at some vertices, and must find its smoothest possible extension to all vertices. The extension we compute is the absolutely minimal Lipschitz extension, and is the limit for large p of p-Laplacian regularization. We present an algorithm that computes a minimal Lipschitz extension in expected linear time, and an algorithm that computes an absolutely minimal Lipschitz extension in expected time Õ(mn). The latter algorithm has variants that seem to run much faster in practice. These extensions are particularly amenable to regularization: we can perform ℓ_0-regularization on the given values in polynomial time and ℓ_1-regularization on the initial function values and on graph edge weights in time Õ(m^(3/2)). Our definitions and algorithms naturally extend to directed graphs.
ER  -
APA
Kyng, R., Rao, A., Sachdeva, S. & Spielman, D.A. (2015). Algorithms for Lipschitz Learning on Graphs. Proceedings of The 28th Conference on Learning Theory, in PMLR 40:1190-1223.
