Double Nyström Method: An Efficient and Accurate Nyström Scheme for Large-Scale Data Sets

Woosang Lim, Minhwan Kim, Haesun Park, Kyomin Jung
Proceedings of the 32nd International Conference on Machine Learning, PMLR 37:1367-1375, 2015.

Abstract

The Nyström method has been one of the most effective techniques for kernel-based approaches that scale well to large data sets. Since its introduction, there has been a large body of work that improves the approximation accuracy while maintaining computational efficiency. In this paper, we present a novel Nyström method that improves both accuracy and efficiency based on a new theoretical analysis. We first provide a generalized sampling scheme, CAPS, that minimizes a novel error bound based on the subspace distance. We then present our double Nyström method that reduces the size of the decomposition in two stages. We show that our method is highly efficient and accurate compared to other state-of-the-art Nyström methods by evaluating them on a number of real data sets.
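For context, the single-stage Nyström approximation that this line of work builds on can be sketched as follows: sample m landmark columns C of an n×n kernel matrix K, take the m×m intersection block W, and approximate K ≈ C W⁺ Cᵀ. This is a minimal illustrative sketch of the standard scheme with uniform sampling, not the paper's CAPS sampling or the two-stage double Nyström method; the kernel, bandwidth, and sample size below are arbitrary assumptions.

```python
import numpy as np

# Toy data and an RBF kernel (all parameter choices here are illustrative).
rng = np.random.default_rng(0)
n, d, m = 500, 5, 50
X = rng.standard_normal((n, d))
sq = np.sum(X**2, axis=1)
# Squared pairwise distances, then RBF kernel with bandwidth gamma = 0.01.
K = np.exp(-0.01 * (sq[:, None] + sq[None, :] - 2 * X @ X.T))

# Standard Nyström: uniformly sample m landmark columns.
idx = rng.choice(n, size=m, replace=False)
C = K[:, idx]            # n x m block of sampled columns
W = K[np.ix_(idx, idx)]  # m x m intersection block

# Low-rank approximation K ≈ C W^+ C^T (pseudo-inverse for stability).
K_approx = C @ np.linalg.pinv(W) @ C.T

rel_err = np.linalg.norm(K - K_approx, "fro") / np.linalg.norm(K, "fro")
```

Computing `K_approx` costs O(nm² + m³) rather than the O(n³) of a full eigendecomposition, which is why sampling schemes (like the CAPS scheme proposed here) that keep m small while controlling the error are the focus of this line of work.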

Cite this Paper


BibTeX
@InProceedings{pmlr-v37-lima15,
  title = {Double Nystr\"om Method: An Efficient and Accurate Nystr\"om Scheme for Large-Scale Data Sets},
  author = {Woosang Lim and Minhwan Kim and Haesun Park and Kyomin Jung},
  booktitle = {Proceedings of the 32nd International Conference on Machine Learning},
  pages = {1367--1375},
  year = {2015},
  editor = {Francis Bach and David Blei},
  volume = {37},
  series = {Proceedings of Machine Learning Research},
  address = {Lille, France},
  month = {07--09 Jul},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v37/lima15.pdf},
  url = {http://proceedings.mlr.press/v37/lima15.html},
  abstract = {The Nyström method has been one of the most effective techniques for kernel-based approaches that scale well to large data sets. Since its introduction, there has been a large body of work that improves the approximation accuracy while maintaining computational efficiency. In this paper, we present a novel Nyström method that improves both accuracy and efficiency based on a new theoretical analysis. We first provide a generalized sampling scheme, CAPS, that minimizes a novel error bound based on the subspace distance. We then present our double Nyström method that reduces the size of the decomposition in two stages. We show that our method is highly efficient and accurate compared to other state-of-the-art Nyström methods by evaluating them on a number of real data sets.}
}
Endnote
%0 Conference Paper
%T Double Nyström Method: An Efficient and Accurate Nyström Scheme for Large-Scale Data Sets
%A Woosang Lim
%A Minhwan Kim
%A Haesun Park
%A Kyomin Jung
%B Proceedings of the 32nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2015
%E Francis Bach
%E David Blei
%F pmlr-v37-lima15
%I PMLR
%J Proceedings of Machine Learning Research
%P 1367--1375
%U http://proceedings.mlr.press
%V 37
%W PMLR
%X The Nyström method has been one of the most effective techniques for kernel-based approaches that scale well to large data sets. Since its introduction, there has been a large body of work that improves the approximation accuracy while maintaining computational efficiency. In this paper, we present a novel Nyström method that improves both accuracy and efficiency based on a new theoretical analysis. We first provide a generalized sampling scheme, CAPS, that minimizes a novel error bound based on the subspace distance. We then present our double Nyström method that reduces the size of the decomposition in two stages. We show that our method is highly efficient and accurate compared to other state-of-the-art Nyström methods by evaluating them on a number of real data sets.
RIS
TY  - CPAPER
TI  - Double Nyström Method: An Efficient and Accurate Nyström Scheme for Large-Scale Data Sets
AU  - Woosang Lim
AU  - Minhwan Kim
AU  - Haesun Park
AU  - Kyomin Jung
BT  - Proceedings of the 32nd International Conference on Machine Learning
PY  - 2015/06/01
DA  - 2015/06/01
ED  - Francis Bach
ED  - David Blei
ID  - pmlr-v37-lima15
PB  - PMLR
SP  - 1367
DP  - PMLR
EP  - 1375
L1  - http://proceedings.mlr.press/v37/lima15.pdf
UR  - http://proceedings.mlr.press/v37/lima15.html
AB  - The Nyström method has been one of the most effective techniques for kernel-based approaches that scale well to large data sets. Since its introduction, there has been a large body of work that improves the approximation accuracy while maintaining computational efficiency. In this paper, we present a novel Nyström method that improves both accuracy and efficiency based on a new theoretical analysis. We first provide a generalized sampling scheme, CAPS, that minimizes a novel error bound based on the subspace distance. We then present our double Nyström method that reduces the size of the decomposition in two stages. We show that our method is highly efficient and accurate compared to other state-of-the-art Nyström methods by evaluating them on a number of real data sets.
ER  -
APA
Lim, W., Kim, M., Park, H. & Jung, K. (2015). Double Nyström Method: An Efficient and Accurate Nyström Scheme for Large-Scale Data Sets. Proceedings of the 32nd International Conference on Machine Learning, in PMLR 37:1367-1375.