Reduced Order Modeling using Shallow ReLU Networks with Grassmann Layers

Kayla Bollinger, Hayden Schaeffer
Proceedings of the 2nd Mathematical and Scientific Machine Learning Conference, PMLR 145:847-867, 2022.

Abstract

This paper presents a nonlinear model reduction method for systems of equations using a structured neural network. The neural network takes the form of a “three-layer” network with the first layer constrained to lie on the Grassmann manifold and the first activation function set to identity, while the remaining network is a standard two-layer ReLU neural network. The Grassmann layer determines the reduced basis for the input space, while the remaining layers approximate the nonlinear input-output system. The training alternates between learning the reduced basis and the nonlinear approximation, and is shown to be more effective than fixing the reduced basis and training only the network. An additional benefit of this approach is that, for data lying on low-dimensional subspaces, the number of parameters in the network does not need to be large. We show that our method can be applied to scientific problems in the data-scarce regime, which is typically not well suited for neural network approximations. Examples include reduced order modeling for nonlinear dynamical systems and several aerospace engineering problems.
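For readers who want a concrete picture of the architecture the abstract describes, the following is a minimal PyTorch sketch, not the authors' implementation: the class name, layer widths, and the QR retraction used to keep the first layer orthonormal (one common way to represent a point on the Grassmann manifold) are illustrative assumptions, and the paper's alternating training scheme is only gestured at in the comments.

```python
import torch
import torch.nn as nn

class GrassmannReLUNet(nn.Module):
    """Hypothetical sketch: a linear "Grassmann" layer with identity
    activation projecting inputs onto an r-dimensional subspace,
    followed by a standard two-layer ReLU network."""

    def __init__(self, d_in: int, r: int, width: int, d_out: int):
        super().__init__()
        # A is a d_in x r orthonormal frame; its column span is a point on
        # the Grassmann manifold Gr(r, d_in) and serves as the reduced basis.
        A0, _ = torch.linalg.qr(torch.randn(d_in, r))
        self.A = nn.Parameter(A0)
        self.hidden = nn.Linear(r, width)   # ReLU layer of the two-layer net
        self.out = nn.Linear(width, d_out)  # linear output layer

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        z = x @ self.A                      # Grassmann layer, identity activation
        return self.out(torch.relu(self.hidden(z)))

    @torch.no_grad()
    def retract(self):
        # QR retraction: after a gradient step on A, re-orthonormalize so A
        # again represents a subspace (a simple stand-in for the Riemannian
        # optimization the paper performs on the Grassmann manifold).
        Q, _ = torch.linalg.qr(self.A)
        self.A.copy_(Q)

def alternating_epoch(model, x, y, opt_basis, opt_net, loss_fn=nn.MSELoss()):
    """Rough analogue of the alternating training: first update the reduced
    basis with the ReLU layers held fixed, then vice versa."""
    # Step 1: update the reduced basis (Grassmann layer) only.
    opt_basis.zero_grad()
    loss_fn(model(x), y).backward()
    opt_basis.step()
    model.retract()  # stay on (a frame for) the Grassmann manifold
    # Step 2: update the two-layer ReLU network only.
    opt_net.zero_grad()
    loss_fn(model(x), y).backward()
    opt_net.step()
```

In this sketch, `opt_basis` would be an optimizer over `[model.A]` alone and `opt_net` one over the `hidden` and `out` parameters, so each phase of the alternation touches only its own variables; the dimensions `d_in`, `r`, `width`, and `d_out` are placeholders for the problem at hand.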

Cite this Paper


BibTeX
@InProceedings{pmlr-v145-bollinger22a,
  title = {Reduced Order Modeling using Shallow ReLU Networks with Grassmann Layers},
  author = {Bollinger, Kayla and Schaeffer, Hayden},
  booktitle = {Proceedings of the 2nd Mathematical and Scientific Machine Learning Conference},
  pages = {847--867},
  year = {2022},
  editor = {Bruna, Joan and Hesthaven, Jan and Zdeborova, Lenka},
  volume = {145},
  series = {Proceedings of Machine Learning Research},
  month = {16--19 Aug},
  publisher = {PMLR},
  pdf = {https://proceedings.mlr.press/v145/bollinger22a/bollinger22a.pdf},
  url = {https://proceedings.mlr.press/v145/bollinger22a.html},
  abstract = {This paper presents a nonlinear model reduction method for systems of equations using a structured neural network. The neural network takes the form of a “three-layer” network with the first layer constrained to lie on the Grassmann manifold and the first activation function set to identity, while the remaining network is a standard two-layer ReLU neural network. The Grassmann layer determines the reduced basis for the input space, while the remaining layers approximate the nonlinear input-output system. The training alternates between learning the reduced basis and the nonlinear approximation, and is shown to be more effective than fixing the reduced basis and training only the network. An additional benefit of this approach is that, for data lying on low-dimensional subspaces, the number of parameters in the network does not need to be large. We show that our method can be applied to scientific problems in the data-scarce regime, which is typically not well suited for neural network approximations. Examples include reduced order modeling for nonlinear dynamical systems and several aerospace engineering problems.}
}
Endnote
%0 Conference Paper
%T Reduced Order Modeling using Shallow ReLU Networks with Grassmann Layers
%A Kayla Bollinger
%A Hayden Schaeffer
%B Proceedings of the 2nd Mathematical and Scientific Machine Learning Conference
%C Proceedings of Machine Learning Research
%D 2022
%E Joan Bruna
%E Jan Hesthaven
%E Lenka Zdeborova
%F pmlr-v145-bollinger22a
%I PMLR
%P 847--867
%U https://proceedings.mlr.press/v145/bollinger22a.html
%V 145
%X This paper presents a nonlinear model reduction method for systems of equations using a structured neural network. The neural network takes the form of a “three-layer” network with the first layer constrained to lie on the Grassmann manifold and the first activation function set to identity, while the remaining network is a standard two-layer ReLU neural network. The Grassmann layer determines the reduced basis for the input space, while the remaining layers approximate the nonlinear input-output system. The training alternates between learning the reduced basis and the nonlinear approximation, and is shown to be more effective than fixing the reduced basis and training only the network. An additional benefit of this approach is that, for data lying on low-dimensional subspaces, the number of parameters in the network does not need to be large. We show that our method can be applied to scientific problems in the data-scarce regime, which is typically not well suited for neural network approximations. Examples include reduced order modeling for nonlinear dynamical systems and several aerospace engineering problems.
APA
Bollinger, K. & Schaeffer, H. (2022). Reduced Order Modeling using Shallow ReLU Networks with Grassmann Layers. Proceedings of the 2nd Mathematical and Scientific Machine Learning Conference, in Proceedings of Machine Learning Research 145:847-867. Available from https://proceedings.mlr.press/v145/bollinger22a.html.
