Low-Precision Random Fourier Features for Memory-Constrained Kernel Approximation
Proceedings of Machine Learning Research, PMLR 89:1264-1274, 2019.
Abstract
We investigate how to train kernel approximation methods that generalize well under a memory budget. Building on recent theoretical work, we define a measure of kernel approximation error which we find to be more predictive of the empirical generalization performance of kernel approximation methods than conventional metrics. An important consequence of this definition is that a kernel approximation matrix must be high rank to attain close approximation. Because storing a high-rank approximation is memory intensive, we propose using a low-precision quantization of random Fourier features (LP-RFFs) to build a high-rank approximation under a memory budget. Theoretically, we show quantization has a negligible effect on generalization performance in important settings. Empirically, we demonstrate across four benchmark datasets that LP-RFFs can match the performance of full-precision RFFs and the Nyström method, with 3x-10x and 50x-460x less memory, respectively.
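To make the idea concrete, below is a minimal sketch of random Fourier features for the RBF kernel followed by a uniform quantization of the feature matrix to a small number of bits. This is an illustrative reconstruction, not the paper's exact implementation: the function names (`rff_features`, `quantize`) are assumptions, and the deterministic rounding used here is a simplification of the quantization scheme a full LP-RFF implementation might use.

```python
import numpy as np

def rff_features(X, D, gamma, rng):
    """Random Fourier features approximating the RBF kernel
    k(x, y) = exp(-gamma * ||x - y||^2) via Z(x) @ Z(y)."""
    d = X.shape[1]
    # Frequencies sampled from the kernel's spectral density (a Gaussian)
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, D))
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

def quantize(Z, bits):
    """Uniformly quantize each feature to 2^bits levels over its
    known range [-sqrt(2/D), sqrt(2/D)] (illustrative; the paper's
    LP-RFF scheme may round differently)."""
    D = Z.shape[1]
    lo, hi = -np.sqrt(2.0 / D), np.sqrt(2.0 / D)
    levels = 2 ** bits - 1
    scaled = np.clip((Z - lo) / (hi - lo), 0.0, 1.0)
    return np.round(scaled * levels) / levels * (hi - lo) + lo

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(20, 5))
    gamma = 0.5

    Z = rff_features(X, 4000, gamma, rng)
    Zq = quantize(Z, 8)  # 8-bit features: ~4x less memory than float32

    # Compare the quantized feature kernel to the exact RBF kernel
    sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K_exact = np.exp(-gamma * sq_dists)
    print(np.abs(Zq @ Zq.T - K_exact).max())
```

The memory point of the abstract shows up directly here: storing `Zq` at `b` bits per entry costs `n * D * b` bits, so lowering the precision allows a proportionally larger `D` (higher rank) under the same budget.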