Adaptive Sampling for Continuous Group Equivariant Neural Networks

Berfin Inal, Gabriele Cesa
Proceedings of the Geometry-grounded Representation Learning and Generative Modeling Workshop (GRaM), PMLR 251:394-419, 2024.

Abstract

Steerable networks, which process data with intrinsic symmetries, often use Fourier-based non-linearities that require sampling from the entire group, leading to a need for discretization in continuous groups. As the number of samples increases, both performance and equivariance improve, yet this also leads to higher computational costs. To address this, we introduce an adaptive sampling approach that dynamically adjusts the sampling process to the symmetries in the data, reducing the number of required group samples and lowering the computational demands. We explore various implementations and their effects on model performance, equivariance, and computational efficiency. Our findings demonstrate improved model performance and a marginal increase in memory efficiency.
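The trade-off the abstract describes can be illustrated with a minimal sketch (not the authors' implementation): a Fourier-based non-linearity on SO(2) evaluates a band-limited feature at N sampled rotations, applies a pointwise ReLU, and projects the result back onto the Fourier basis. The function name and setup below are illustrative assumptions; the point is that a denser sampling grid yields a more nearly equivariant operation, at higher cost.

```python
import numpy as np

def fourier_nonlinearity(coeffs, n_samples):
    """Apply a pointwise ReLU to a band-limited function on SO(2).

    The function is sampled at n_samples equispaced rotations, the ReLU
    is applied sample-wise, and the result is projected back onto the
    original Fourier basis by least squares. More samples reduce the
    aliasing error, i.e. improve equivariance, but cost more compute.
    """
    K = (len(coeffs) - 1) // 2              # band limit; coeffs index k = -K..K
    thetas = 2 * np.pi * np.arange(n_samples) / n_samples
    ks = np.arange(-K, K + 1)
    A = np.exp(1j * np.outer(thetas, ks))   # sampling matrix A[n, j] = e^{i k_j theta_n}
    samples = (A @ coeffs).real             # evaluate f at the sampled rotations
    activated = np.maximum(samples, 0.0)    # pointwise non-linearity
    # project the activated samples back onto the band-limited basis
    new_coeffs, *_ = np.linalg.lstsq(A, activated.astype(complex), rcond=None)
    return new_coeffs
```

Rotating the input corresponds to a phase shift of the Fourier coefficients, so equivariance can be checked by comparing "rotate then activate" against "activate then rotate": the discrepancy shrinks as n_samples grows, which is exactly the cost/equivariance tension that adaptive sampling targets.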

Cite this Paper


BibTeX
@InProceedings{pmlr-v251-inal24a,
  title     = {Adaptive Sampling for Continuous Group Equivariant Neural Networks},
  author    = {Inal, Berfin and Cesa, Gabriele},
  booktitle = {Proceedings of the Geometry-grounded Representation Learning and Generative Modeling Workshop (GRaM)},
  pages     = {394--419},
  year      = {2024},
  editor    = {Vadgama, Sharvaree and Bekkers, Erik and Pouplin, Alison and Kaba, Sekou-Oumar and Walters, Robin and Lawrence, Hannah and Emerson, Tegan and Kvinge, Henry and Tomczak, Jakub and Jegelka, Stephanie},
  volume    = {251},
  series    = {Proceedings of Machine Learning Research},
  month     = {29 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v251/main/assets/inal24a/inal24a.pdf},
  url       = {https://proceedings.mlr.press/v251/inal24a.html},
  abstract  = {Steerable networks, which process data with intrinsic symmetries, often use Fourier-based non-linearities that require sampling from the entire group, leading to a need for discretization in continuous groups. As the number of samples increases, both performance and equivariance improve, yet this also leads to higher computational costs. To address this, we introduce an adaptive sampling approach that dynamically adjusts the sampling process to the symmetries in the data, reducing the number of required group samples and lowering the computational demands. We explore various implementations and their effects on model performance, equivariance, and computational efficiency. Our findings demonstrate improved model performance and a marginal increase in memory efficiency.}
}
Endnote
%0 Conference Paper
%T Adaptive Sampling for Continuous Group Equivariant Neural Networks
%A Berfin Inal
%A Gabriele Cesa
%B Proceedings of the Geometry-grounded Representation Learning and Generative Modeling Workshop (GRaM)
%C Proceedings of Machine Learning Research
%D 2024
%E Sharvaree Vadgama
%E Erik Bekkers
%E Alison Pouplin
%E Sekou-Oumar Kaba
%E Robin Walters
%E Hannah Lawrence
%E Tegan Emerson
%E Henry Kvinge
%E Jakub Tomczak
%E Stephanie Jegelka
%F pmlr-v251-inal24a
%I PMLR
%P 394--419
%U https://proceedings.mlr.press/v251/inal24a.html
%V 251
%X Steerable networks, which process data with intrinsic symmetries, often use Fourier-based non-linearities that require sampling from the entire group, leading to a need for discretization in continuous groups. As the number of samples increases, both performance and equivariance improve, yet this also leads to higher computational costs. To address this, we introduce an adaptive sampling approach that dynamically adjusts the sampling process to the symmetries in the data, reducing the number of required group samples and lowering the computational demands. We explore various implementations and their effects on model performance, equivariance, and computational efficiency. Our findings demonstrate improved model performance and a marginal increase in memory efficiency.
APA
Inal, B. & Cesa, G. (2024). Adaptive Sampling for Continuous Group Equivariant Neural Networks. Proceedings of the Geometry-grounded Representation Learning and Generative Modeling Workshop (GRaM), in Proceedings of Machine Learning Research 251:394-419. Available from https://proceedings.mlr.press/v251/inal24a.html.