Improved Generalization of Weight Space Networks via Augmentations

Aviv Shamsian, Aviv Navon, David W. Zhang, Yan Zhang, Ethan Fetaya, Gal Chechik, Haggai Maron
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:44378-44393, 2024.

Abstract

Learning in deep weight spaces (DWS), where neural networks process the weights of other neural networks, is an emerging research direction, with applications to 2D and 3D neural fields (INRs, NeRFs), as well as making inferences about other types of neural networks. Unfortunately, weight space models tend to suffer from substantial overfitting. We empirically analyze the reasons for this overfitting and find that a key reason is the lack of diversity in DWS datasets. While a given object can be represented by many different weight configurations, typical INR training sets fail to capture variability across INRs that represent the same object. To address this, we explore strategies for data augmentation in weight spaces and propose a MixUp method adapted for weight spaces. We demonstrate the effectiveness of these methods in two setups. In classification, they improve performance similarly to having up to 10 times more data. In self-supervised contrastive learning, they yield substantial 5-10% gains in downstream classification.
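As a rough illustration of the kind of augmentation the abstract refers to, below is a minimal sketch of MixUp applied directly to network weights: two same-architecture networks (e.g., INRs) are interpolated layer by layer with a Beta-sampled coefficient, and their labels are mixed with the same coefficient. This is plain MixUp transplanted to weight space; the function name, arguments, and the per-layer linear interpolation are assumptions for illustration, and the paper's weight-space-adapted variant is not reproduced here.

import numpy as np

def weight_space_mixup(params_a, params_b, labels_a, labels_b, alpha=0.2, rng=None):
    """Hypothetical sketch: interpolate two same-architecture weight sets and their labels.

    params_a, params_b: lists of numpy arrays with matching shapes (per-layer weights/biases)
    labels_a, labels_b: one-hot label vectors
    """
    rng = rng or np.random.default_rng()
    lam = rng.beta(alpha, alpha)  # mixing coefficient, as in standard MixUp
    mixed_params = [lam * wa + (1.0 - lam) * wb for wa, wb in zip(params_a, params_b)]
    mixed_labels = lam * labels_a + (1.0 - lam) * labels_b
    return mixed_params, mixed_labels

# Usage: mix two toy single-layer networks with the same architecture.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    net_a = [rng.standard_normal((16, 2)), rng.standard_normal(16)]
    net_b = [rng.standard_normal((16, 2)), rng.standard_normal(16)]
    y_a, y_b = np.eye(10)[3], np.eye(10)[7]
    mixed_net, mixed_y = weight_space_mixup(net_a, net_b, y_a, y_b, rng=rng)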

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-shamsian24a,
  title     = {Improved Generalization of Weight Space Networks via Augmentations},
  author    = {Shamsian, Aviv and Navon, Aviv and Zhang, David W. and Zhang, Yan and Fetaya, Ethan and Chechik, Gal and Maron, Haggai},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {44378--44393},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/shamsian24a/shamsian24a.pdf},
  url       = {https://proceedings.mlr.press/v235/shamsian24a.html},
  abstract  = {Learning in deep weight spaces (DWS), where neural networks process the weights of other neural networks, is an emerging research direction, with applications to 2D and 3D neural fields (INRs, NeRFs), as well as making inferences about other types of neural networks. Unfortunately, weight space models tend to suffer from substantial overfitting. We empirically analyze the reasons for this overfitting and find that a key reason is the lack of diversity in DWS datasets. While a given object can be represented by many different weight configurations, typical INR training sets fail to capture variability across INRs that represent the same object. To address this, we explore strategies for data augmentation in weight spaces and propose a MixUp method adapted for weight spaces. We demonstrate the effectiveness of these methods in two setups. In classification, they improve performance similarly to having up to 10 times more data. In self-supervised contrastive learning, they yield substantial 5-10% gains in downstream classification.}
}
Endnote
%0 Conference Paper
%T Improved Generalization of Weight Space Networks via Augmentations
%A Aviv Shamsian
%A Aviv Navon
%A David W. Zhang
%A Yan Zhang
%A Ethan Fetaya
%A Gal Chechik
%A Haggai Maron
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-shamsian24a
%I PMLR
%P 44378--44393
%U https://proceedings.mlr.press/v235/shamsian24a.html
%V 235
%X Learning in deep weight spaces (DWS), where neural networks process the weights of other neural networks, is an emerging research direction, with applications to 2D and 3D neural fields (INRs, NeRFs), as well as making inferences about other types of neural networks. Unfortunately, weight space models tend to suffer from substantial overfitting. We empirically analyze the reasons for this overfitting and find that a key reason is the lack of diversity in DWS datasets. While a given object can be represented by many different weight configurations, typical INR training sets fail to capture variability across INRs that represent the same object. To address this, we explore strategies for data augmentation in weight spaces and propose a MixUp method adapted for weight spaces. We demonstrate the effectiveness of these methods in two setups. In classification, they improve performance similarly to having up to 10 times more data. In self-supervised contrastive learning, they yield substantial 5-10% gains in downstream classification.
APA
Shamsian, A., Navon, A., Zhang, D.W., Zhang, Y., Fetaya, E., Chechik, G. & Maron, H. (2024). Improved Generalization of Weight Space Networks via Augmentations. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:44378-44393. Available from https://proceedings.mlr.press/v235/shamsian24a.html.
