INO: Invariant Neural Operators for Learning Complex Physical Systems with Momentum Conservation

Ning Liu, Yue Yu, Huaiqian You, Neeraj Tatikola
Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, PMLR 206:6822-6838, 2023.

Abstract

Neural operators, which emerge as implicit solution operators of hidden governing equations, have recently become popular tools for learning responses of complex real-world physical systems. Nevertheless, the majority of neural operator applications have thus far been data-driven, neglecting the intrinsic preservation of fundamental physical laws in data. In this paper, we introduce a novel integral neural operator architecture to learn physical models with fundamental conservation laws automatically guaranteed. In particular, by replacing the frame-dependent position information with its invariant counterpart in the kernel space, the proposed neural operator is designed to be translation- and rotation-invariant, and consequently abides by the conservation laws of linear and angular momenta. As applications, we demonstrate the expressivity and efficacy of our model in learning complex material behaviors from both synthetic and experimental datasets, and show that, by automatically satisfying these essential physical laws, our learned neural operator is not only generalizable in handling translated and rotated datasets, but also achieves improved accuracy and efficiency over baseline neural operator models.
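The key idea in the abstract can be illustrated with a minimal NumPy sketch (not the authors' implementation): a kernel that depends only on the invariant quantity ||x − y|| is unchanged under any rigid-body motion of the coordinate frame, whereas a kernel evaluated on absolute positions is not. The kernel functions below are hypothetical toy examples.

```python
import numpy as np

def kernel_invariant(x, y):
    # Kernel evaluated on the relative distance ||x - y|| only;
    # rigid-body motions (rotation + translation) leave it unchanged.
    return np.exp(-np.linalg.norm(x - y) ** 2)

def kernel_frame_dependent(x, y):
    # Toy kernel evaluated on absolute coordinates; its value changes
    # whenever the coordinate frame is translated or rotated.
    return np.sum(x) + np.sum(y)

# Two fixed 2D points and a rigid transform g: rotation R followed by translation t.
x = np.array([1.0, 0.0])
y = np.array([0.0, 1.0])
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
t = np.array([1.5, -0.3])
gx, gy = R @ x + t, R @ y + t  # transformed points

print(np.isclose(kernel_invariant(x, y), kernel_invariant(gx, gy)))              # True
print(np.isclose(kernel_frame_dependent(x, y), kernel_frame_dependent(gx, gy)))  # False
```

Because ||R(x − y)|| = ||x − y|| for any rotation R, and the translation t cancels in the difference x − y, the invariant kernel returns identical values before and after the frame change; the frame-dependent one does not.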

Cite this Paper


BibTeX
@InProceedings{pmlr-v206-liu23f,
  title     = {INO: Invariant Neural Operators for Learning Complex Physical Systems with Momentum Conservation},
  author    = {Liu, Ning and Yu, Yue and You, Huaiqian and Tatikola, Neeraj},
  booktitle = {Proceedings of The 26th International Conference on Artificial Intelligence and Statistics},
  pages     = {6822--6838},
  year      = {2023},
  editor    = {Ruiz, Francisco and Dy, Jennifer and van de Meent, Jan-Willem},
  volume    = {206},
  series    = {Proceedings of Machine Learning Research},
  month     = {25--27 Apr},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v206/liu23f/liu23f.pdf},
  url       = {https://proceedings.mlr.press/v206/liu23f.html},
  abstract  = {Neural operators, which emerge as implicit solution operators of hidden governing equations, have recently become popular tools for learning responses of complex real-world physical systems. Nevertheless, the majority of neural operator applications has thus far been data-driven, which neglects the intrinsic preservation of fundamental physical laws in data. In this paper, we introduce a novel integral neural operator architecture, to learn physical models with fundamental conservation laws automatically guaranteed. In particular, by replacing the frame-dependent position information with its invariant counterpart in the kernel space, the proposed neural operator is designed to be translation- and rotation-invariant, and consequently abides by the conservation laws of linear and angular momentums. As applications, we demonstrate the expressivity and efficacy of our model in learning complex material behaviors from both synthetic and experimental datasets, and show that, by automatically satisfying these essential physical laws, our learned neural operator is not only generalizable in handling translated and rotated datasets, but also achieves improved accuracy and efficiency from the baseline neural operator models.}
}
Endnote
%0 Conference Paper %T INO: Invariant Neural Operators for Learning Complex Physical Systems with Momentum Conservation %A Ning Liu %A Yue Yu %A Huaiqian You %A Neeraj Tatikola %B Proceedings of The 26th International Conference on Artificial Intelligence and Statistics %C Proceedings of Machine Learning Research %D 2023 %E Francisco Ruiz %E Jennifer Dy %E Jan-Willem van de Meent %F pmlr-v206-liu23f %I PMLR %P 6822--6838 %U https://proceedings.mlr.press/v206/liu23f.html %V 206 %X Neural operators, which emerge as implicit solution operators of hidden governing equations, have recently become popular tools for learning responses of complex real-world physical systems. Nevertheless, the majority of neural operator applications has thus far been data-driven, which neglects the intrinsic preservation of fundamental physical laws in data. In this paper, we introduce a novel integral neural operator architecture, to learn physical models with fundamental conservation laws automatically guaranteed. In particular, by replacing the frame-dependent position information with its invariant counterpart in the kernel space, the proposed neural operator is designed to be translation- and rotation-invariant, and consequently abides by the conservation laws of linear and angular momentums. As applications, we demonstrate the expressivity and efficacy of our model in learning complex material behaviors from both synthetic and experimental datasets, and show that, by automatically satisfying these essential physical laws, our learned neural operator is not only generalizable in handling translated and rotated datasets, but also achieves improved accuracy and efficiency from the baseline neural operator models.
APA
Liu, N., Yu, Y., You, H. & Tatikola, N. (2023). INO: Invariant Neural Operators for Learning Complex Physical Systems with Momentum Conservation. Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 206:6822-6838. Available from https://proceedings.mlr.press/v206/liu23f.html.