Beyond Regular Grids: Fourier-Based Neural Operators on Arbitrary Domains

Levi E. Lingsch, Mike Yan Michelis, Emmanuel De Bezenac, Sirani M. Perera, Robert K. Katzschmann, Siddhartha Mishra
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:30610-30629, 2024.

Abstract

The computational efficiency of many neural operators, widely used for learning solutions of PDEs, relies on the fast Fourier transform (FFT) for spectral computations. Because the FFT is restricted to equispaced (rectangular) grids, such neural operators lose their efficiency when the input and output functions must be processed on general non-equispaced point distributions. Leveraging the observation that a limited set of Fourier (spectral) modes suffices to provide the required expressivity of a neural operator, we propose a simple method, based on the efficient direct evaluation of the underlying spectral transformation, to extend neural operators to arbitrary domains. An efficient implementation of these direct spectral evaluations is coupled with existing neural operator models, allowing data on arbitrary non-equispaced point distributions to be processed. Through extensive empirical evaluation, we demonstrate that the proposed method extends neural operators to arbitrary point distributions with significant gains in training speed over baselines, while retaining or improving the accuracy of Fourier neural operators (FNOs) and related neural operators.
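The core operation the abstract describes, evaluating a truncated Fourier transform directly at arbitrary (non-equispaced) points instead of relying on the FFT, can be sketched in a few lines. The following is a hedged 1D illustration, not the authors' implementation: the function names, the `[0, 1)` domain, and the `1/N` normalization are assumptions made for clarity. Because only a small number of modes is kept, the direct matrix evaluation costs O(N x M) for N points and M modes, which stays cheap when M is small.

```python
import numpy as np

def direct_spectral_transform(x, f, n_modes):
    """Truncated forward Fourier transform of samples f at arbitrary
    1D points x in [0, 1), keeping modes k = -n_modes..n_modes.
    Direct matrix evaluation -- no equispaced grid required."""
    k = np.arange(-n_modes, n_modes + 1)
    # Vandermonde-like matrix of complex exponentials, shape (modes, points)
    V = np.exp(-2j * np.pi * np.outer(k, x))
    return V @ f / len(x)

def inverse_spectral_transform(x, coeffs, n_modes):
    """Evaluate the truncated Fourier series with the given coefficients
    back at the (arbitrary) points x."""
    k = np.arange(-n_modes, n_modes + 1)
    # Shape (points, modes): each row evaluates all kept modes at one point
    V = np.exp(2j * np.pi * np.outer(x, k))
    return V @ coeffs
```

On an equispaced grid this reduces to a truncated DFT; on non-equispaced points the simple `1/N` weighting is only a quadrature approximation, which a learned operator built around these transforms can absorb.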

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-lingsch24a,
  title     = {Beyond Regular Grids: {F}ourier-Based Neural Operators on Arbitrary Domains},
  author    = {Lingsch, Levi E. and Michelis, Mike Yan and De Bezenac, Emmanuel and M. Perera, Sirani and Katzschmann, Robert K. and Mishra, Siddhartha},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {30610--30629},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/lingsch24a/lingsch24a.pdf},
  url       = {https://proceedings.mlr.press/v235/lingsch24a.html},
  abstract  = {The computational efficiency of many neural operators, widely used for learning solutions of PDEs, relies on the fast Fourier transform (FFT) for performing spectral computations. As the FFT is limited to equispaced (rectangular) grids, this limits the efficiency of such neural operators when applied to problems where the input and output functions need to be processed on general non-equispaced point distributions. Leveraging the observation that a limited set of Fourier (Spectral) modes suffice to provide the required expressivity of a neural operator, we propose a simple method, based on the efficient direct evaluation of the underlying spectral transformation, to extend neural operators to arbitrary domains. An efficient implementation of such direct spectral evaluations is coupled with existing neural operator models to allow the processing of data on arbitrary non-equispaced distributions of points. With extensive empirical evaluation, we demonstrate that the proposed method allows us to extend neural operators to arbitrary point distributions with significant gains in training speed over baselines, while retaining or improving the accuracy of Fourier neural operators (FNOs) and related neural operators.}
}
Endnote
%0 Conference Paper
%T Beyond Regular Grids: Fourier-Based Neural Operators on Arbitrary Domains
%A Levi E. Lingsch
%A Mike Yan Michelis
%A Emmanuel De Bezenac
%A Sirani M. Perera
%A Robert K. Katzschmann
%A Siddhartha Mishra
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-lingsch24a
%I PMLR
%P 30610--30629
%U https://proceedings.mlr.press/v235/lingsch24a.html
%V 235
%X The computational efficiency of many neural operators, widely used for learning solutions of PDEs, relies on the fast Fourier transform (FFT) for performing spectral computations. As the FFT is limited to equispaced (rectangular) grids, this limits the efficiency of such neural operators when applied to problems where the input and output functions need to be processed on general non-equispaced point distributions. Leveraging the observation that a limited set of Fourier (Spectral) modes suffice to provide the required expressivity of a neural operator, we propose a simple method, based on the efficient direct evaluation of the underlying spectral transformation, to extend neural operators to arbitrary domains. An efficient implementation of such direct spectral evaluations is coupled with existing neural operator models to allow the processing of data on arbitrary non-equispaced distributions of points. With extensive empirical evaluation, we demonstrate that the proposed method allows us to extend neural operators to arbitrary point distributions with significant gains in training speed over baselines, while retaining or improving the accuracy of Fourier neural operators (FNOs) and related neural operators.
APA
Lingsch, L.E., Michelis, M.Y., De Bezenac, E., M. Perera, S., Katzschmann, R.K. & Mishra, S. (2024). Beyond Regular Grids: Fourier-Based Neural Operators on Arbitrary Domains. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:30610-30629. Available from https://proceedings.mlr.press/v235/lingsch24a.html.