Deep Canonical Correlation Analysis

Galen Andrew, Raman Arora, Jeff Bilmes, Karen Livescu
Proceedings of the 30th International Conference on Machine Learning, PMLR 28(3):1247-1255, 2013.

Abstract

We introduce Deep Canonical Correlation Analysis (DCCA), a method to learn complex nonlinear transformations of two views of data such that the resulting representations are highly linearly correlated. Parameters of both transformations are jointly learned to maximize the (regularized) total correlation. It can be viewed as a nonlinear extension of the linear method canonical correlation analysis (CCA). It is an alternative to the nonparametric method kernel canonical correlation analysis (KCCA) for learning correlated nonlinear transformations. Unlike KCCA, DCCA does not require an inner product, and has the advantages of a parametric method: training time scales well with data size and the training data need not be referenced when computing the representations of unseen instances. In experiments on two real-world datasets, we find that DCCA learns representations with significantly higher correlation than those learned by CCA and KCCA. We also introduce a novel non-saturating sigmoid function based on the cube root that may be useful more generally in feedforward neural networks.
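As a concrete illustration of the objective the abstract describes, the sketch below computes the (regularized) total correlation between two views: after centering, each view's covariance is whitened and the singular values of the resulting cross-covariance are summed; those singular values are exactly the canonical correlations. This is a minimal NumPy sketch, not the authors' implementation. The function names, the ridge parameters r1 and r2, and the reading of the cube-root sigmoid as the inverse of g(y) = y^3/3 + y (solved by Newton's method) are assumptions made here for illustration.

import numpy as np

def total_correlation(H1, H2, r1=1e-4, r2=1e-4):
    """Sum of canonical correlations between two views.

    H1: (n, d1) array, outputs of the first transformation, one row per instance.
    H2: (n, d2) array, outputs of the second transformation.
    r1, r2: small ridge terms that regularize the covariance estimates.
    """
    n = H1.shape[0]
    H1c = H1 - H1.mean(axis=0)  # center each view
    H2c = H2 - H2.mean(axis=0)
    S11 = H1c.T @ H1c / (n - 1) + r1 * np.eye(H1.shape[1])
    S22 = H2c.T @ H2c / (n - 1) + r2 * np.eye(H2.shape[1])
    S12 = H1c.T @ H2c / (n - 1)

    def inv_sqrt(S):
        # Inverse matrix square root of a symmetric positive definite matrix.
        w, V = np.linalg.eigh(S)
        return V @ np.diag(w ** -0.5) @ V.T

    # Singular values of T = S11^{-1/2} S12 S22^{-1/2} are the canonical
    # correlations; their sum is the (regularized) total correlation.
    T = inv_sqrt(S11) @ S12 @ inv_sqrt(S22)
    return np.linalg.svd(T, compute_uv=False).sum()

def cube_root_activation(x, iters=20):
    """One reading (an assumption here) of a cube-root-based non-saturating
    sigmoid: s = g^{-1} with g(y) = y**3 / 3 + y, computed by Newton's method.
    s behaves like the identity near 0 but grows like (3x)**(1/3) for large |x|,
    so unlike tanh it never saturates.
    """
    y = np.cbrt(x)  # good initial guess, exact in the large-|x| limit
    for _ in range(iters):
        y = y - (y ** 3 / 3.0 + y - x) / (y ** 2 + 1.0)  # Newton step; g' >= 1
    return y

# Usage: two views driven by a common source are almost perfectly correlated.
rng = np.random.default_rng(0)
Z = rng.normal(size=(1000, 10))
H1 = Z + 0.1 * rng.normal(size=(1000, 10))
H2 = Z @ rng.normal(size=(10, 10)) + 0.1 * rng.normal(size=(1000, 10))
print(total_correlation(H1, H2))  # close to 10 (= number of dimensions)

In DCCA itself this quantity is computed on the outputs of two deep networks and its gradient is backpropagated through both, so the sketch above corresponds only to the objective evaluated at the top layer.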

Cite this Paper


BibTeX
@InProceedings{pmlr-v28-andrew13,
  title     = {Deep Canonical Correlation Analysis},
  author    = {Andrew, Galen and Arora, Raman and Bilmes, Jeff and Livescu, Karen},
  booktitle = {Proceedings of the 30th International Conference on Machine Learning},
  pages     = {1247--1255},
  year      = {2013},
  editor    = {Dasgupta, Sanjoy and McAllester, David},
  volume    = {28},
  number    = {3},
  series    = {Proceedings of Machine Learning Research},
  address   = {Atlanta, Georgia, USA},
  month     = {17--19 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v28/andrew13.pdf},
  url       = {https://proceedings.mlr.press/v28/andrew13.html},
  abstract  = {We introduce Deep Canonical Correlation Analysis (DCCA), a method to learn complex nonlinear transformations of two views of data such that the resulting representations are highly linearly correlated. Parameters of both transformations are jointly learned to maximize the (regularized) total correlation. It can be viewed as a nonlinear extension of the linear method \emph{canonical correlation analysis} (CCA). It is an alternative to the nonparametric method \emph{kernel canonical correlation analysis} (KCCA) for learning correlated nonlinear transformations. Unlike KCCA, DCCA does not require an inner product, and has the advantages of a parametric method: training time scales well with data size and the training data need not be referenced when computing the representations of unseen instances. In experiments on two real-world datasets, we find that DCCA learns representations with significantly higher correlation than those learned by CCA and KCCA. We also introduce a novel non-saturating sigmoid function based on the cube root that may be useful more generally in feedforward neural networks.}
}
Endnote
%0 Conference Paper
%T Deep Canonical Correlation Analysis
%A Galen Andrew
%A Raman Arora
%A Jeff Bilmes
%A Karen Livescu
%B Proceedings of the 30th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2013
%E Sanjoy Dasgupta
%E David McAllester
%F pmlr-v28-andrew13
%I PMLR
%P 1247--1255
%U https://proceedings.mlr.press/v28/andrew13.html
%V 28
%N 3
%X We introduce Deep Canonical Correlation Analysis (DCCA), a method to learn complex nonlinear transformations of two views of data such that the resulting representations are highly linearly correlated. Parameters of both transformations are jointly learned to maximize the (regularized) total correlation. It can be viewed as a nonlinear extension of the linear method canonical correlation analysis (CCA). It is an alternative to the nonparametric method kernel canonical correlation analysis (KCCA) for learning correlated nonlinear transformations. Unlike KCCA, DCCA does not require an inner product, and has the advantages of a parametric method: training time scales well with data size and the training data need not be referenced when computing the representations of unseen instances. In experiments on two real-world datasets, we find that DCCA learns representations with significantly higher correlation than those learned by CCA and KCCA. We also introduce a novel non-saturating sigmoid function based on the cube root that may be useful more generally in feedforward neural networks.
RIS
TY - CPAPER
TI - Deep Canonical Correlation Analysis
AU - Galen Andrew
AU - Raman Arora
AU - Jeff Bilmes
AU - Karen Livescu
BT - Proceedings of the 30th International Conference on Machine Learning
DA - 2013/05/26
ED - Sanjoy Dasgupta
ED - David McAllester
ID - pmlr-v28-andrew13
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 28
IS - 3
SP - 1247
EP - 1255
L1 - http://proceedings.mlr.press/v28/andrew13.pdf
UR - https://proceedings.mlr.press/v28/andrew13.html
AB - We introduce Deep Canonical Correlation Analysis (DCCA), a method to learn complex nonlinear transformations of two views of data such that the resulting representations are highly linearly correlated. Parameters of both transformations are jointly learned to maximize the (regularized) total correlation. It can be viewed as a nonlinear extension of the linear method canonical correlation analysis (CCA). It is an alternative to the nonparametric method kernel canonical correlation analysis (KCCA) for learning correlated nonlinear transformations. Unlike KCCA, DCCA does not require an inner product, and has the advantages of a parametric method: training time scales well with data size and the training data need not be referenced when computing the representations of unseen instances. In experiments on two real-world datasets, we find that DCCA learns representations with significantly higher correlation than those learned by CCA and KCCA. We also introduce a novel non-saturating sigmoid function based on the cube root that may be useful more generally in feedforward neural networks.
ER -
APA
Andrew, G., Arora, R., Bilmes, J. & Livescu, K. (2013). Deep Canonical Correlation Analysis. Proceedings of the 30th International Conference on Machine Learning, in Proceedings of Machine Learning Research 28(3):1247-1255. Available from https://proceedings.mlr.press/v28/andrew13.html.
