Multi-Fidelity Residual Neural Processes for Scalable Surrogate Modeling

Ruijia Niu, Dongxia Wu, Kai Kim, Yian Ma, Duncan Watson-Parris, Rose Yu
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:38381-38394, 2024.

Abstract

Multi-fidelity surrogate modeling aims to learn an accurate surrogate at the highest fidelity level by combining data from multiple sources. Traditional methods relying on Gaussian processes scale poorly to high-dimensional data. Deep learning approaches use neural-network-based encoders and decoders to improve scalability, but they share encoded representations across fidelities without including the corresponding decoder parameters. This hinders inference performance, especially in out-of-distribution scenarios where the highest-fidelity data has limited domain coverage. To address these limitations, we propose Multi-fidelity Residual Neural Processes (MFRNP), a novel multi-fidelity surrogate modeling framework. MFRNP explicitly models the residual between the aggregated output from lower fidelities and the ground truth at the highest fidelity. The aggregation introduces decoders into the information-sharing step and optimizes lower-fidelity decoders to accurately capture both in-fidelity and cross-fidelity information. We show that MFRNP significantly outperforms state-of-the-art methods in learning partial differential equations and on a real-world climate modeling task. Our code is published at: https://github.com/Rose-STL-Lab/MFRNP
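The residual idea at the heart of the abstract can be illustrated with a deliberately simple NumPy toy, not the paper's actual neural-process architecture: fit a surrogate on abundant low-fidelity data, then fit a second, low-capacity model to the residual between scarce high-fidelity ground truth and the low-fidelity surrogate's output. All function names and the two toy simulators below are illustrative assumptions, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical simulators: a cheap, biased low-fidelity model and an
# expensive high-fidelity ground truth that differ by a smooth residual.
f_lo = lambda x: np.sin(8 * x)                 # abundant low-fidelity data
f_hi = lambda x: np.sin(8 * x) + 0.3 * x ** 2  # scarce high-fidelity data

x_lo = rng.uniform(0.0, 1.0, 200)  # many low-fidelity samples
x_hi = rng.uniform(0.0, 1.0, 20)   # few high-fidelity samples

# Step 1: fit a flexible surrogate to the plentiful low-fidelity data.
lo_coef = np.polyfit(x_lo, f_lo(x_lo), deg=9)
lo_pred = lambda x: np.polyval(lo_coef, x)

# Step 2: fit only the residual (high-fidelity truth minus low-fidelity
# prediction) with a low-order model, using the scarce high-fidelity data.
res_coef = np.polyfit(x_hi, f_hi(x_hi) - lo_pred(x_hi), deg=2)
surrogate = lambda x: lo_pred(x) + np.polyval(res_coef, x)

# The combined surrogate tracks the high-fidelity function closely even
# though only 20 high-fidelity points were used.
x_test = np.linspace(0.0, 1.0, 100)
err_resid = np.abs(surrogate(x_test) - f_hi(x_test)).mean()
```

Because the residual is smooth and low-order, it can be learned from far fewer high-fidelity samples than the full target function would require; MFRNP applies the same principle with neural processes, additionally aggregating outputs across multiple lower fidelities.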

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-niu24d,
  title     = {Multi-Fidelity Residual Neural Processes for Scalable Surrogate Modeling},
  author    = {Niu, Ruijia and Wu, Dongxia and Kim, Kai and Ma, Yian and Watson-Parris, Duncan and Yu, Rose},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {38381--38394},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/niu24d/niu24d.pdf},
  url       = {https://proceedings.mlr.press/v235/niu24d.html},
  abstract  = {Multi-fidelity surrogate modeling aims to learn an accurate surrogate at the highest fidelity level by combining data from multiple sources. Traditional methods relying on Gaussian processes can hardly scale to high-dimensional data. Deep learning approaches utilize neural network based encoders and decoders to improve scalability. These approaches share encoded representations across fidelities without including corresponding decoder parameters. This hinders inference performance, especially in out-of-distribution scenarios when the highest fidelity data has limited domain coverage. To address these limitations, we propose Multi-fidelity Residual Neural Processes (MFRNP), a novel multi-fidelity surrogate modeling framework. MFRNP explicitly models the residual between the aggregated output from lower fidelities and ground truth at the highest fidelity. The aggregation introduces decoders into the information sharing step and optimizes lower fidelity decoders to accurately capture both in-fidelity and cross-fidelity information. We show that MFRNP significantly outperforms state-of-the-art in learning partial differential equations and a real-world climate modeling task. Our code is published at: https://github.com/Rose-STL-Lab/MFRNP}
}
Endnote
%0 Conference Paper
%T Multi-Fidelity Residual Neural Processes for Scalable Surrogate Modeling
%A Ruijia Niu
%A Dongxia Wu
%A Kai Kim
%A Yian Ma
%A Duncan Watson-Parris
%A Rose Yu
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-niu24d
%I PMLR
%P 38381--38394
%U https://proceedings.mlr.press/v235/niu24d.html
%V 235
%X Multi-fidelity surrogate modeling aims to learn an accurate surrogate at the highest fidelity level by combining data from multiple sources. Traditional methods relying on Gaussian processes can hardly scale to high-dimensional data. Deep learning approaches utilize neural network based encoders and decoders to improve scalability. These approaches share encoded representations across fidelities without including corresponding decoder parameters. This hinders inference performance, especially in out-of-distribution scenarios when the highest fidelity data has limited domain coverage. To address these limitations, we propose Multi-fidelity Residual Neural Processes (MFRNP), a novel multi-fidelity surrogate modeling framework. MFRNP explicitly models the residual between the aggregated output from lower fidelities and ground truth at the highest fidelity. The aggregation introduces decoders into the information sharing step and optimizes lower fidelity decoders to accurately capture both in-fidelity and cross-fidelity information. We show that MFRNP significantly outperforms state-of-the-art in learning partial differential equations and a real-world climate modeling task. Our code is published at: https://github.com/Rose-STL-Lab/MFRNP
APA
Niu, R., Wu, D., Kim, K., Ma, Y., Watson-Parris, D. & Yu, R. (2024). Multi-Fidelity Residual Neural Processes for Scalable Surrogate Modeling. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:38381-38394. Available from https://proceedings.mlr.press/v235/niu24d.html.
