Equivalence between representational similarity analysis, centered kernel alignment, and canonical correlations analysis

Alex H Williams
Proceedings of UniReps: the Second Edition of the Workshop on Unifying Representations in Neural Models, PMLR 285:10-23, 2024.

Abstract

Centered kernel alignment (CKA) and representational similarity analysis (RSA) of dissimilarity matrices are two popular methods for quantifying similarity in neural representational geometry. Although they follow a conceptually similar approach, typical implementations of CKA and RSA tend to result in numerically different outcomes. Here, I show that these two approaches are largely equivalent once one incorporates a mean-centering step into RSA. This connection is quite simple to derive, but appears to have been thus far overlooked by the community studying neural representational geometry. By unifying these measures, this paper hopes to simplify a complex and fragmented literature on this subject.
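The equivalence described in the abstract can be checked numerically: linear CKA between two representation matrices coincides with the cosine similarity of their double-centered squared-Euclidean RDMs, since double-centering a squared-distance matrix yields minus twice the centered Gram matrix and the constant factors cancel in the ratio. Below is a minimal NumPy sketch (function names and the random test matrices are illustrative, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, q = 20, 5, 7
X = rng.standard_normal((n, p))  # n stimuli, p neurons
Y = rng.standard_normal((n, q))  # n stimuli, q neurons

H = np.eye(n) - np.ones((n, n)) / n  # centering matrix

def linear_cka(X, Y):
    """Linear CKA: normalized inner product of centered Gram matrices."""
    Kx = H @ X @ X.T @ H
    Ky = H @ Y @ Y.T @ H
    return np.sum(Kx * Ky) / (np.linalg.norm(Kx) * np.linalg.norm(Ky))

def centered_rsa(X, Y):
    """RSA with a mean-centering step: cosine similarity of
    double-centered squared-Euclidean distance matrices (RDMs)."""
    def centered_rdm(Z):
        sq = np.sum(Z**2, axis=1)
        D = sq[:, None] + sq[None, :] - 2 * Z @ Z.T  # squared distances
        return H @ D @ H  # double-centering: H D H = -2 H Z Z^T H
    Dx, Dy = centered_rdm(X), centered_rdm(Y)
    return np.sum(Dx * Dy) / (np.linalg.norm(Dx) * np.linalg.norm(Dy))

print(linear_cka(X, Y))   # the two values agree to machine precision
print(centered_rsa(X, Y))
```

Without the double-centering step, the two scores generally differ, which matches the abstract's observation that typical implementations of CKA and RSA produce numerically different outcomes.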

Cite this Paper


BibTeX
@InProceedings{pmlr-v285-williams24a,
  title     = {Equivalence between representational similarity analysis, centered kernel alignment, and canonical correlations analysis},
  author    = {Williams, Alex H},
  booktitle = {Proceedings of UniReps: the Second Edition of the Workshop on Unifying Representations in Neural Models},
  pages     = {10--23},
  year      = {2024},
  editor    = {Fumero, Marco and Domine, Clementine and Lähner, Zorah and Crisostomi, Donato and Moschella, Luca and Stachenfeld, Kimberly},
  volume    = {285},
  series    = {Proceedings of Machine Learning Research},
  month     = {14 Dec},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v285/main/assets/williams24a/williams24a.pdf},
  url       = {https://proceedings.mlr.press/v285/williams24a.html},
  abstract  = {Centered kernel alignment (CKA) and representational similarity analysis (RSA) of dissimilarity matrices are two popular methods for quantifying similarity in neural representational geometry. Although they follow a conceptually similar approach, typical implementations of CKA and RSA tend to result in numerically different outcomes. Here, I show that these two approaches are largely equivalent once one incorporates a mean-centering step into RSA. This connection is quite simple to derive, but appears to have been thus far overlooked by the community studying neural representational geometry. By unifying these measures, this paper hopes to simplify a complex and fragmented literature on this subject.}
}
Endnote
%0 Conference Paper
%T Equivalence between representational similarity analysis, centered kernel alignment, and canonical correlations analysis
%A Alex H Williams
%B Proceedings of UniReps: the Second Edition of the Workshop on Unifying Representations in Neural Models
%C Proceedings of Machine Learning Research
%D 2024
%E Marco Fumero
%E Clementine Domine
%E Zorah Lähner
%E Donato Crisostomi
%E Luca Moschella
%E Kimberly Stachenfeld
%F pmlr-v285-williams24a
%I PMLR
%P 10--23
%U https://proceedings.mlr.press/v285/williams24a.html
%V 285
%X Centered kernel alignment (CKA) and representational similarity analysis (RSA) of dissimilarity matrices are two popular methods for quantifying similarity in neural representational geometry. Although they follow a conceptually similar approach, typical implementations of CKA and RSA tend to result in numerically different outcomes. Here, I show that these two approaches are largely equivalent once one incorporates a mean-centering step into RSA. This connection is quite simple to derive, but appears to have been thus far overlooked by the community studying neural representational geometry. By unifying these measures, this paper hopes to simplify a complex and fragmented literature on this subject.
APA
Williams, A. H. (2024). Equivalence between representational similarity analysis, centered kernel alignment, and canonical correlations analysis. Proceedings of UniReps: the Second Edition of the Workshop on Unifying Representations in Neural Models, in Proceedings of Machine Learning Research 285:10-23. Available from https://proceedings.mlr.press/v285/williams24a.html.