Mechanistic Mode Connectivity

Ekdeep Singh Lubana, Eric J Bigelow, Robert P. Dick, David Krueger, Hidenori Tanaka
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:22965-23004, 2023.

Abstract

We study neural network loss landscapes through the lens of mode connectivity, the observation that minimizers of neural networks retrieved via training on a dataset are connected via simple paths of low loss. Specifically, we ask the following question: are minimizers that rely on different mechanisms for making their predictions connected via simple paths of low loss? We provide a definition of mechanistic similarity as shared invariances to input transformations and demonstrate that lack of linear connectivity between two models implies they use dissimilar mechanisms for making their predictions. Relevant to practice, this result helps us demonstrate that naive fine-tuning on a downstream dataset can fail to alter a model’s mechanisms, e.g., fine-tuning can fail to eliminate a model’s reliance on spurious attributes. Our analysis also motivates a method for targeted alteration of a model’s mechanisms, named connectivity-based fine-tuning (CBFT), which we analyze using several synthetic datasets for the task of reducing a model’s reliance on spurious attributes.
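The key notion in the abstract is linear connectivity: whether the loss stays low along the straight line in parameter space between two trained minimizers. The sketch below (not from the paper) illustrates how such a check is typically done: interpolate the weights of two models and evaluate the loss at each interpolation point; a pronounced barrier indicates a lack of linear connectivity, which by the paper's result implies mechanistically dissimilar solutions. It is a minimal sketch assuming two trained PyTorch models of identical architecture; `model_a`, `model_b`, `loader`, and `criterion` are placeholder names, not artifacts of the paper.

```python
import copy
import torch

@torch.no_grad()
def loss_along_linear_path(model_a, model_b, loader, criterion,
                           n_points=11, device="cpu"):
    """Evaluate the loss at evenly spaced points on the linear path
    theta(alpha) = (1 - alpha) * theta_a + alpha * theta_b."""
    interp = copy.deepcopy(model_a).to(device)
    params_a = {k: v.to(device) for k, v in model_a.state_dict().items()}
    params_b = {k: v.to(device) for k, v in model_b.state_dict().items()}
    losses = []
    for alpha in torch.linspace(0.0, 1.0, n_points):
        # Convex combination of the two parameter vectors.
        mixed = {k: (1 - alpha) * params_a[k].float() + alpha * params_b[k].float()
                 for k in params_a}
        interp.load_state_dict(mixed)
        interp.eval()
        total, count = 0.0, 0
        for x, y in loader:
            x, y = x.to(device), y.to(device)
            total += criterion(interp(x), y).item() * x.size(0)
            count += x.size(0)
        losses.append(total / count)
    return losses

# Usage (placeholder objects): a large gap between max(losses) and the
# endpoint losses signals a loss barrier, i.e. no linear connectivity.
# losses = loss_along_linear_path(model_a, model_b, loader, torch.nn.CrossEntropyLoss())
```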

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-lubana23a,
  title     = {Mechanistic Mode Connectivity},
  author    = {Lubana, Ekdeep Singh and Bigelow, Eric J and Dick, Robert P. and Krueger, David and Tanaka, Hidenori},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {22965--23004},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/lubana23a/lubana23a.pdf},
  url       = {https://proceedings.mlr.press/v202/lubana23a.html},
  abstract  = {We study neural network loss landscapes through the lens of mode connectivity, the observation that minimizers of neural networks retrieved via training on a dataset are connected via simple paths of low loss. Specifically, we ask the following question: are minimizers that rely on different mechanisms for making their predictions connected via simple paths of low loss? We provide a definition of mechanistic similarity as shared invariances to input transformations and demonstrate that lack of linear connectivity between two models implies they use dissimilar mechanisms for making their predictions. Relevant to practice, this result helps us demonstrate that naive fine-tuning on a downstream dataset can fail to alter a model’s mechanisms, e.g., fine-tuning can fail to eliminate a model’s reliance on spurious attributes. Our analysis also motivates a method for targeted alteration of a model’s mechanisms, named connectivity-based fine-tuning (CBFT), which we analyze using several synthetic datasets for the task of reducing a model’s reliance on spurious attributes.}
}
Endnote
%0 Conference Paper
%T Mechanistic Mode Connectivity
%A Ekdeep Singh Lubana
%A Eric J Bigelow
%A Robert P. Dick
%A David Krueger
%A Hidenori Tanaka
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-lubana23a
%I PMLR
%P 22965--23004
%U https://proceedings.mlr.press/v202/lubana23a.html
%V 202
%X We study neural network loss landscapes through the lens of mode connectivity, the observation that minimizers of neural networks retrieved via training on a dataset are connected via simple paths of low loss. Specifically, we ask the following question: are minimizers that rely on different mechanisms for making their predictions connected via simple paths of low loss? We provide a definition of mechanistic similarity as shared invariances to input transformations and demonstrate that lack of linear connectivity between two models implies they use dissimilar mechanisms for making their predictions. Relevant to practice, this result helps us demonstrate that naive fine-tuning on a downstream dataset can fail to alter a model’s mechanisms, e.g., fine-tuning can fail to eliminate a model’s reliance on spurious attributes. Our analysis also motivates a method for targeted alteration of a model’s mechanisms, named connectivity-based fine-tuning (CBFT), which we analyze using several synthetic datasets for the task of reducing a model’s reliance on spurious attributes.
APA
Lubana, E.S., Bigelow, E.J., Dick, R.P., Krueger, D. & Tanaka, H. (2023). Mechanistic Mode Connectivity. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:22965-23004. Available from https://proceedings.mlr.press/v202/lubana23a.html.
