Recycling Randomness with Structure for Sublinear time Kernel Expansions

Krzysztof Choromanski, Vikas Sindhwani
Proceedings of The 33rd International Conference on Machine Learning, PMLR 48:2502-2510, 2016.

Abstract

We propose a scheme for recycling Gaussian random vectors into structured matrices to approximate various kernel functions in sublinear time via random embeddings. Our framework includes the Fastfood construction of Le et al. (2013) as a special case, but also extends to Circulant, Toeplitz and Hankel matrices, and the broader family of structured matrices that are characterized by the concept of low-displacement rank. We introduce notions of coherence and graph-theoretic structural constants that control the approximation quality, and prove unbiasedness and low-variance properties of random feature maps that arise within our framework. For the case of low-displacement matrices, we show how the degree of structure and randomness can be controlled to reduce statistical variance at the cost of increased computation and storage requirements. Empirical results strongly support our theory and justify the use of a broader family of structured matrices for scaling up kernel methods using random features.
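To make the core idea concrete, here is a minimal NumPy sketch of the circulant special case, not the paper's exact construction: the function name circulant_rff, the bandwidth parameter, and the Rademacher sign diagonal are illustrative assumptions. A single recycled Gaussian vector defines the entire projection matrix, and the FFT applies it in O(n log n) time with O(n) storage, versus O(mn) for an unstructured Gaussian matrix in standard random Fourier features.

    import numpy as np

    def circulant_rff(X, n_features, rng, bandwidth=1.0):
        # Random Fourier features for the Gaussian kernel, with the dense
        # Gaussian projection replaced by a circulant matrix C built from a
        # single "recycled" Gaussian vector g (the first column of C),
        # composed with a diagonal of random signs D to decorrelate rows.
        n = X.shape[1]
        assert n_features <= n  # sketch: use rows of one circulant block
        g = rng.standard_normal(n)          # the one Gaussian vector being recycled
        signs = rng.choice([-1.0, 1.0], n)  # Rademacher diagonal D
        # Circulant matvec via the convolution theorem:
        # C v = ifft(fft(g) * fft(v)), applied to every row of X D at once.
        projected = np.fft.ifft(np.fft.fft(g) * np.fft.fft(X * signs, axis=1),
                                axis=1).real
        projected = projected[:, :n_features] / bandwidth
        b = rng.uniform(0.0, 2.0 * np.pi, n_features)  # random phases
        return np.sqrt(2.0 / n_features) * np.cos(projected + b)

    rng = np.random.default_rng(0)
    X = rng.standard_normal((5, 64))
    Z = circulant_rff(X, 64, rng)
    K_approx = Z @ Z.T  # entrywise approximation of exp(-||x - y||^2 / 2)

Each row of C D is marginally Gaussian, so each feature is an unbiased estimator of the kernel value; the rows are correlated, and quantifying the resulting variance is exactly what the paper's coherence and structural constants address.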

Cite this Paper


BibTeX
@InProceedings{pmlr-v48-choromanski16,
  title     = {Recycling Randomness with Structure for Sublinear time Kernel Expansions},
  author    = {Choromanski, Krzysztof and Sindhwani, Vikas},
  booktitle = {Proceedings of The 33rd International Conference on Machine Learning},
  pages     = {2502--2510},
  year      = {2016},
  editor    = {Balcan, Maria Florina and Weinberger, Kilian Q.},
  volume    = {48},
  series    = {Proceedings of Machine Learning Research},
  address   = {New York, New York, USA},
  month     = {20--22 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v48/choromanski16.pdf},
  url       = {https://proceedings.mlr.press/v48/choromanski16.html},
  abstract  = {We propose a scheme for recycling Gaussian random vectors into structured matrices to approximate various kernel functions in sublinear time via random embeddings. Our framework includes the Fastfood construction of Le et al. (2013) as a special case, but also extends to Circulant, Toeplitz and Hankel matrices, and the broader family of structured matrices that are characterized by the concept of low-displacement rank. We introduce notions of coherence and graph-theoretic structural constants that control the approximation quality, and prove unbiasedness and low-variance properties of random feature maps that arise within our framework. For the case of low-displacement matrices, we show how the degree of structure and randomness can be controlled to reduce statistical variance at the cost of increased computation and storage requirements. Empirical results strongly support our theory and justify the use of a broader family of structured matrices for scaling up kernel methods using random features.}
}
Endnote
%0 Conference Paper
%T Recycling Randomness with Structure for Sublinear time Kernel Expansions
%A Krzysztof Choromanski
%A Vikas Sindhwani
%B Proceedings of The 33rd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2016
%E Maria Florina Balcan
%E Kilian Q. Weinberger
%F pmlr-v48-choromanski16
%I PMLR
%P 2502--2510
%U https://proceedings.mlr.press/v48/choromanski16.html
%V 48
%X We propose a scheme for recycling Gaussian random vectors into structured matrices to approximate various kernel functions in sublinear time via random embeddings. Our framework includes the Fastfood construction of Le et al. (2013) as a special case, but also extends to Circulant, Toeplitz and Hankel matrices, and the broader family of structured matrices that are characterized by the concept of low-displacement rank. We introduce notions of coherence and graph-theoretic structural constants that control the approximation quality, and prove unbiasedness and low-variance properties of random feature maps that arise within our framework. For the case of low-displacement matrices, we show how the degree of structure and randomness can be controlled to reduce statistical variance at the cost of increased computation and storage requirements. Empirical results strongly support our theory and justify the use of a broader family of structured matrices for scaling up kernel methods using random features.
RIS
TY - CPAPER
TI - Recycling Randomness with Structure for Sublinear time Kernel Expansions
AU - Krzysztof Choromanski
AU - Vikas Sindhwani
BT - Proceedings of The 33rd International Conference on Machine Learning
DA - 2016/06/11
ED - Maria Florina Balcan
ED - Kilian Q. Weinberger
ID - pmlr-v48-choromanski16
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 48
SP - 2502
EP - 2510
L1 - http://proceedings.mlr.press/v48/choromanski16.pdf
UR - https://proceedings.mlr.press/v48/choromanski16.html
AB - We propose a scheme for recycling Gaussian random vectors into structured matrices to approximate various kernel functions in sublinear time via random embeddings. Our framework includes the Fastfood construction of Le et al. (2013) as a special case, but also extends to Circulant, Toeplitz and Hankel matrices, and the broader family of structured matrices that are characterized by the concept of low-displacement rank. We introduce notions of coherence and graph-theoretic structural constants that control the approximation quality, and prove unbiasedness and low-variance properties of random feature maps that arise within our framework. For the case of low-displacement matrices, we show how the degree of structure and randomness can be controlled to reduce statistical variance at the cost of increased computation and storage requirements. Empirical results strongly support our theory and justify the use of a broader family of structured matrices for scaling up kernel methods using random features.
ER -
APA
Choromanski, K., & Sindhwani, V. (2016). Recycling Randomness with Structure for Sublinear time Kernel Expansions. Proceedings of The 33rd International Conference on Machine Learning, in Proceedings of Machine Learning Research 48:2502-2510. Available from https://proceedings.mlr.press/v48/choromanski16.html.
