The Variational Nystrom method for large-scale spectral problems

Max Vladymyrov, Miguel Carreira-Perpinan
Proceedings of The 33rd International Conference on Machine Learning, PMLR 48:211-220, 2016.

Abstract

Spectral methods for dimensionality reduction and clustering require solving an eigenproblem defined by a sparse affinity matrix. When this matrix is large, one seeks an approximate solution. The standard way to do this is the Nystrom method, which first solves a small eigenproblem considering only a subset of landmark points, and then applies an out-of-sample formula to extrapolate the solution to the entire dataset. We show that by constraining the original problem to satisfy the Nystrom formula, we obtain an approximation that is computationally simple and efficient, but achieves a lower approximation error using fewer landmarks and less runtime. We also study the role of normalization in the computational cost and quality of the resulting solution.
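For context, the standard Nyström method the abstract describes can be sketched as follows: solve the small eigenproblem on the landmark block of the affinity matrix, then use the out-of-sample formula to extrapolate the eigenvectors to the remaining points. This is a minimal NumPy sketch of the baseline method (not the paper's variational variant); the function name and interface are illustrative assumptions.

```python
import numpy as np

def nystrom_eigenvectors(W, landmarks, k):
    """Approximate the top-k eigenvectors of a symmetric affinity
    matrix W (n x n) with the standard Nystrom method.

    landmarks: indices of the m landmark points (m << n).
    Returns an (n x k) matrix of approximate eigenvectors and the
    k landmark-block eigenvalues.
    """
    rest = np.setdiff1d(np.arange(W.shape[0]), landmarks)
    A = W[np.ix_(landmarks, landmarks)]   # m x m landmark block
    B = W[np.ix_(landmarks, rest)]        # m x (n-m) cross block

    # Small eigenproblem on the landmark block only.
    evals, U = np.linalg.eigh(A)
    order = np.argsort(evals)[::-1][:k]   # keep the top-k eigenpairs
    evals, U = evals[order], U[:, order]

    # Out-of-sample (Nystrom) formula for the non-landmark points.
    U_rest = (B.T @ U) / evals            # divide each column by its eigenvalue
    V = np.empty((W.shape[0], k))
    V[landmarks] = U
    V[rest] = U_rest
    return V, evals
```

When the affinity matrix has exact rank k and the landmark block is nonsingular on its range, this extrapolation reproduces the matrix exactly; in general it is an approximation whose quality depends on the landmark choice, which is the trade-off the paper's variational formulation targets.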

Cite this Paper


BibTeX
@InProceedings{pmlr-v48-vladymyrov16,
  title     = {The Variational Nystrom method for large-scale spectral problems},
  author    = {Max Vladymyrov and Miguel Carreira-Perpinan},
  pages     = {211--220},
  year      = {2016},
  editor    = {Maria Florina Balcan and Kilian Q. Weinberger},
  volume    = {48},
  series    = {Proceedings of Machine Learning Research},
  address   = {New York, New York, USA},
  month     = {20--22 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v48/vladymyrov16.pdf},
  url       = {http://proceedings.mlr.press/v48/vladymyrov16.html},
  abstract  = {Spectral methods for dimensionality reduction and clustering require solving an eigenproblem defined by a sparse affinity matrix. When this matrix is large, one seeks an approximate solution. The standard way to do this is the Nystrom method, which first solves a small eigenproblem considering only a subset of landmark points, and then applies an out-of-sample formula to extrapolate the solution to the entire dataset. We show that by constraining the original problem to satisfy the Nystrom formula, we obtain an approximation that is computationally simple and efficient, but achieves a lower approximation error using fewer landmarks and less runtime. We also study the role of normalization in the computational cost and quality of the resulting solution.}
}
Endnote
%0 Conference Paper
%T The Variational Nystrom method for large-scale spectral problems
%A Max Vladymyrov
%A Miguel Carreira-Perpinan
%B Proceedings of The 33rd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2016
%E Maria Florina Balcan
%E Kilian Q. Weinberger
%F pmlr-v48-vladymyrov16
%I PMLR
%J Proceedings of Machine Learning Research
%P 211--220
%U http://proceedings.mlr.press
%V 48
%W PMLR
%X Spectral methods for dimensionality reduction and clustering require solving an eigenproblem defined by a sparse affinity matrix. When this matrix is large, one seeks an approximate solution. The standard way to do this is the Nystrom method, which first solves a small eigenproblem considering only a subset of landmark points, and then applies an out-of-sample formula to extrapolate the solution to the entire dataset. We show that by constraining the original problem to satisfy the Nystrom formula, we obtain an approximation that is computationally simple and efficient, but achieves a lower approximation error using fewer landmarks and less runtime. We also study the role of normalization in the computational cost and quality of the resulting solution.
RIS
TY  - CPAPER
TI  - The Variational Nystrom method for large-scale spectral problems
AU  - Max Vladymyrov
AU  - Miguel Carreira-Perpinan
BT  - Proceedings of The 33rd International Conference on Machine Learning
PY  - 2016/06/11
DA  - 2016/06/11
ED  - Maria Florina Balcan
ED  - Kilian Q. Weinberger
ID  - pmlr-v48-vladymyrov16
PB  - PMLR
DP  - PMLR
SP  - 211
EP  - 220
L1  - http://proceedings.mlr.press/v48/vladymyrov16.pdf
UR  - http://proceedings.mlr.press/v48/vladymyrov16.html
AB  - Spectral methods for dimensionality reduction and clustering require solving an eigenproblem defined by a sparse affinity matrix. When this matrix is large, one seeks an approximate solution. The standard way to do this is the Nystrom method, which first solves a small eigenproblem considering only a subset of landmark points, and then applies an out-of-sample formula to extrapolate the solution to the entire dataset. We show that by constraining the original problem to satisfy the Nystrom formula, we obtain an approximation that is computationally simple and efficient, but achieves a lower approximation error using fewer landmarks and less runtime. We also study the role of normalization in the computational cost and quality of the resulting solution.
ER  -
APA
Vladymyrov, M. & Carreira-Perpinan, M. (2016). The Variational Nystrom method for large-scale spectral problems. Proceedings of The 33rd International Conference on Machine Learning, in PMLR 48:211-220.
