Convex Representation Learning for Generalized Invariance in Semi-Inner-Product Space

Yingyi Ma, Vignesh Ganapathiraman, Yaoliang Yu, Xinhua Zhang
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:6532-6542, 2020.

Abstract

Invariance (defined in a general sense) has been one of the most effective priors for representation learning. Direct factorization of parametric models is feasible only for a small range of invariances, while regularization approaches, despite improved generality, lead to nonconvex optimization. In this work, we develop a \emph{convex} representation learning algorithm for a variety of generalized invariances that can be modeled as semi-norms. Novel Euclidean embeddings are introduced for kernel representers in a semi-inner-product space, and approximation bounds are established. This allows invariant representations to be learned efficiently and effectively as confirmed in our experiments, along with accurate predictions.
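To make the semi-norm view of invariance concrete, here is a minimal illustrative sketch; the operator D and the translation example below are our assumptions for exposition, not the paper's construction. An invariance prior can be written as a semi-norm p: a functional that is absolutely homogeneous and subadditive like a norm, but that may vanish on nonzero functions, so the invariant functions form exactly its null space.

% Illustrative sketch only: D is a hypothetical linear operator whose null
% space consists of the functions invariant to the prior (e.g., D the
% directional derivative along a direction, so Df = 0 iff f is invariant
% to translations along that direction).
\[
  p(f) = \|Df\|, \qquad
  p(\alpha f) = |\alpha|\, p(f), \qquad
  p(f + g) \le p(f) + p(g).
\]
% p is a semi-norm rather than a norm because p(f) = 0 does not force f = 0:
% any f with Df = 0 is fully invariant and incurs no penalty, so regularizing
% with p steers learning toward invariant representations.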

Cite this Paper

BibTeX
@InProceedings{pmlr-v119-ma20b,
  title     = {Convex Representation Learning for Generalized Invariance in Semi-Inner-Product Space},
  author    = {Ma, Yingyi and Ganapathiraman, Vignesh and Yu, Yaoliang and Zhang, Xinhua},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {6532--6542},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/ma20b/ma20b.pdf},
  url       = {https://proceedings.mlr.press/v119/ma20b.html},
  abstract  = {Invariance (defined in a general sense) has been one of the most effective priors for representation learning. Direct factorization of parametric models is feasible only for a small range of invariances, while regularization approaches, despite improved generality, lead to nonconvex optimization. In this work, we develop a \emph{convex} representation learning algorithm for a variety of generalized invariances that can be modeled as semi-norms. Novel Euclidean embeddings are introduced for kernel representers in a semi-inner-product space, and approximation bounds are established. This allows invariant representations to be learned efficiently and effectively as confirmed in our experiments, along with accurate predictions.}
}
Endnote
%0 Conference Paper
%T Convex Representation Learning for Generalized Invariance in Semi-Inner-Product Space
%A Yingyi Ma
%A Vignesh Ganapathiraman
%A Yaoliang Yu
%A Xinhua Zhang
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-ma20b
%I PMLR
%P 6532--6542
%U https://proceedings.mlr.press/v119/ma20b.html
%V 119
%X Invariance (defined in a general sense) has been one of the most effective priors for representation learning. Direct factorization of parametric models is feasible only for a small range of invariances, while regularization approaches, despite improved generality, lead to nonconvex optimization. In this work, we develop a \emph{convex} representation learning algorithm for a variety of generalized invariances that can be modeled as semi-norms. Novel Euclidean embeddings are introduced for kernel representers in a semi-inner-product space, and approximation bounds are established. This allows invariant representations to be learned efficiently and effectively as confirmed in our experiments, along with accurate predictions.
APA
Ma, Y., Ganapathiraman, V., Yu, Y. & Zhang, X. (2020). Convex Representation Learning for Generalized Invariance in Semi-Inner-Product Space. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:6532-6542. Available from https://www.proceedings.mlr.press/v119/ma20b.html.