Learning Invariant Representations with Kernel Warping

Yingyi Ma, Vignesh Ganapathiraman, Xinhua Zhang
Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, PMLR 89:1003-1012, 2019.

Abstract

Invariance is an effective prior that has been extensively used to bias supervised learning with a \emph{given} representation of data. In order to learn invariant representations, wavelet and scattering based methods “hard code” invariance over the \emph{entire} sample space, and are hence restricted to a limited range of transformations. Kernels based on Haar integration likewise apply only to a \emph{group} of transformations. In this work, we break this limitation by designing a new representation learning algorithm that incorporates invariances \emph{beyond transformation}. Our approach, which warps the kernel in a data-dependent fashion, is computationally efficient thanks to random features, and leads to a deep kernel through multiple layers. We apply it to convolutional kernel networks and demonstrate its stability.
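To make the abstract's efficiency claim concrete, below is a minimal Python sketch of the random-features approximation it invokes (random Fourier features, Rahimi & Recht, 2007), composed with a data-dependent map. The warp function here is a hypothetical placeholder, not the paper's construction: the paper learns its warping from invariance constraints, whereas this sketch only illustrates the general shape of the computation, a base kernel evaluated on warped inputs, k_warped(x, y) = k(g(x), g(y)).

import numpy as np

def random_fourier_features(X, n_features=256, gamma=1.0, seed=None):
    """Explicit feature map z(x) with z(x) @ z(y) ~= exp(-gamma * ||x - y||^2),
    so kernel methods run in time linear in the number of samples."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Frequencies drawn from the RBF kernel's spectral density N(0, 2*gamma*I)
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

def warp(X):
    # Hypothetical data-dependent warp (per-dimension standardization).
    # The paper learns its warp; this stand-in only shows where a learned
    # map g would enter: the kernel is evaluated on g(x) instead of x.
    return (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-8)

X = np.random.default_rng(0).normal(size=(100, 5))
Z = random_fourier_features(warp(X), n_features=512, seed=0)
K_approx = Z @ Z.T  # approximates the warped kernel Gram matrix

Stacking several such stages, each an explicit feature map followed by another map on its output, is one way to read the abstract's "deep kernel through multiple layers"; the precise layered construction is given in the paper itself.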

Cite this Paper


BibTeX
@InProceedings{pmlr-v89-ma19a,
  title     = {Learning Invariant Representations with Kernel Warping},
  author    = {Ma, Yingyi and Ganapathiraman, Vignesh and Zhang, Xinhua},
  booktitle = {Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics},
  pages     = {1003--1012},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Sugiyama, Masashi},
  volume    = {89},
  series    = {Proceedings of Machine Learning Research},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v89/ma19a/ma19a.pdf},
  url       = {https://proceedings.mlr.press/v89/ma19a.html}
}
APA
Ma, Y., Ganapathiraman, V., & Zhang, X. (2019). Learning Invariant Representations with Kernel Warping. In Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, Proceedings of Machine Learning Research 89:1003-1012. Available from https://proceedings.mlr.press/v89/ma19a.html.
