Soft Matching Distance: A metric on neural representations that captures single-neuron tuning

Meenakshi Khosla, Alex H Williams
Proceedings of UniReps: the First Workshop on Unifying Representations in Neural Models, PMLR 243:326-341, 2024.

Abstract

Common measures of neural representational (dis)similarity are designed to be insensitive to rotations and reflections of the neural activation space. Motivated by the premise that the tuning of individual units may be important, there has been recent interest in developing stricter notions of representational (dis)similarity that require neurons to be individually matched across networks. When two networks have the same size (i.e. same number of neurons), a distance metric can be formulated by optimizing over neuron index permutations to maximize tuning curve alignment. However, it is not clear how to generalize this metric to measure distances between networks with different sizes. Here, we leverage a connection to optimal transport theory to derive a natural generalization based on “soft” permutations. The resulting metric is symmetric, satisfies the triangle inequality, and can be interpreted as a Wasserstein distance between two empirical distributions. Further, our proposed metric avoids counter-intuitive outcomes suffered by alternative approaches, and captures complementary geometric insights into neural representations that are entirely missed by rotation-invariant metrics.
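To make the construction concrete, the following is a minimal sketch of both the hard (permutation-based) and soft (optimal-transport-based) matching distances described above. It is not the authors' reference implementation: it assumes tuning matrices X and Y whose rows are neurons and whose columns are responses to the same set of stimuli, uses a squared-Euclidean ground metric between tuning curves (the paper's preferred notion of tuning curve alignment may differ), and relies on the POT library (pip install pot) for the exact optimal transport solve. The function names are illustrative, not from the paper.

```python
# Illustrative sketch only -- NOT the authors' reference implementation.
# Assumptions: X is (Nx, M) and Y is (Ny, M), rows = neurons, columns =
# responses to the same M stimuli; squared-Euclidean ground metric between
# tuning curves; POT library for the optimal transport solve.
import numpy as np
import ot  # Python Optimal Transport (pip install pot)
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist


def hard_matching_distance(X, Y):
    """Equal-size case: optimize over neuron index permutations."""
    assert X.shape[0] == Y.shape[0], "hard matching needs equal neuron counts"
    C = cdist(X, Y) ** 2                   # pairwise squared distances
    rows, cols = linear_sum_assignment(C)  # optimal permutation (Hungarian)
    return np.sqrt(C[rows, cols].mean())


def soft_matching_distance(X, Y):
    """General case: 'soft' permutations via optimal transport.

    Treats each network as a uniform empirical distribution over its
    neurons' tuning curves and returns the resulting 2-Wasserstein
    distance; with equal sizes this recovers the hard matching distance.
    """
    a = np.full(X.shape[0], 1.0 / X.shape[0])  # uniform mass on X's neurons
    b = np.full(Y.shape[0], 1.0 / Y.shape[0])  # uniform mass on Y's neurons
    C = cdist(X, Y) ** 2                       # ground cost between tuning curves
    return np.sqrt(ot.emd2(a, b, C))           # exact OT cost, then sqrt


# Toy usage: two random "networks" of different sizes, 50 shared stimuli.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 50))
Y = rng.standard_normal((80, 50))
print(soft_matching_distance(X, Y))
```

Because the transport plan is constrained only to have uniform marginals rather than to be a permutation matrix, the soft version is well defined when the two networks have different numbers of neurons, which is the generalization the abstract refers to.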

Cite this Paper

BibTeX
@InProceedings{pmlr-v243-khosla24a,
  title     = {Soft Matching Distance: A metric on neural representations that captures single-neuron tuning},
  author    = {Khosla, Meenakshi and Williams, Alex H},
  booktitle = {Proceedings of UniReps: the First Workshop on Unifying Representations in Neural Models},
  pages     = {326--341},
  year      = {2024},
  editor    = {Fumero, Marco and Rodolà, Emanuele and Domine, Clementine and Locatello, Francesco and Dziugaite, Karolina and Caron, Mathilde},
  volume    = {243},
  series    = {Proceedings of Machine Learning Research},
  month     = {15 Dec},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v243/khosla24a/khosla24a.pdf},
  url       = {https://proceedings.mlr.press/v243/khosla24a.html},
  abstract  = {Common measures of neural representational (dis)similarity are designed to be insensitive to rotations and reflections of the neural activation space. Motivated by the premise that the tuning of individual units may be important, there has been recent interest in developing stricter notions of representational (dis)similarity that require neurons to be individually matched across networks. When two networks have the same size (i.e. same number of neurons), a distance metric can be formulated by optimizing over neuron index permutations to maximize tuning curve alignment. However, it is not clear how to generalize this metric to measure distances between networks with different sizes. Here, we leverage a connection to optimal transport theory to derive a natural generalization based on “soft” permutations. The resulting metric is symmetric, satisfies the triangle inequality, and can be interpreted as a Wasserstein distance between two empirical distributions. Further, our proposed metric avoids counter-intuitive outcomes suffered by alternative approaches, and captures complementary geometric insights into neural representations that are entirely missed by rotation-invariant metrics.}
}
Endnote
%0 Conference Paper
%T Soft Matching Distance: A metric on neural representations that captures single-neuron tuning
%A Meenakshi Khosla
%A Alex H Williams
%B Proceedings of UniReps: the First Workshop on Unifying Representations in Neural Models
%C Proceedings of Machine Learning Research
%D 2024
%E Marco Fumero
%E Emanuele Rodolà
%E Clementine Domine
%E Francesco Locatello
%E Karolina Dziugaite
%E Mathilde Caron
%F pmlr-v243-khosla24a
%I PMLR
%P 326--341
%U https://proceedings.mlr.press/v243/khosla24a.html
%V 243
%X Common measures of neural representational (dis)similarity are designed to be insensitive to rotations and reflections of the neural activation space. Motivated by the premise that the tuning of individual units may be important, there has been recent interest in developing stricter notions of representational (dis)similarity that require neurons to be individually matched across networks. When two networks have the same size (i.e. same number of neurons), a distance metric can be formulated by optimizing over neuron index permutations to maximize tuning curve alignment. However, it is not clear how to generalize this metric to measure distances between networks with different sizes. Here, we leverage a connection to optimal transport theory to derive a natural generalization based on “soft” permutations. The resulting metric is symmetric, satisfies the triangle inequality, and can be interpreted as a Wasserstein distance between two empirical distributions. Further, our proposed metric avoids counter-intuitive outcomes suffered by alternative approaches, and captures complementary geometric insights into neural representations that are entirely missed by rotation-invariant metrics.
APA
Khosla, M. & Williams, A. H. (2024). Soft Matching Distance: A metric on neural representations that captures single-neuron tuning. Proceedings of UniReps: the First Workshop on Unifying Representations in Neural Models, in Proceedings of Machine Learning Research 243:326-341. Available from https://proceedings.mlr.press/v243/khosla24a.html.
