Operator SVD with Neural Networks via Nested Low-Rank Approximation

Jongha Jon Ryu, Xiangxiang Xu, Hasan Sabri Melihcan Erol, Yuheng Bu, Lizhong Zheng, Gregory W. Wornell
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:42870-42905, 2024.

Abstract

Computing the eigenvalue decomposition (EVD) of a given linear operator, or finding its leading eigenvalues and eigenfunctions, is a fundamental task in many machine learning and scientific simulation problems. For high-dimensional eigenvalue problems, training neural networks to parameterize the eigenfunctions is considered a promising alternative to classical numerical linear algebra techniques. This paper proposes a new optimization framework based on the low-rank approximation characterization of a truncated singular value decomposition, accompanied by new techniques called nesting for learning the top-$L$ singular values and singular functions in the correct order. The proposed method promotes the desired orthogonality in the learned functions implicitly and efficiently via an unconstrained optimization formulation, which is easy to solve with off-the-shelf gradient-based optimization algorithms. We demonstrate the effectiveness of the proposed optimization framework for use cases in computational physics and machine learning.
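To illustrate the idea behind the abstract, here is a minimal toy sketch in the finite-dimensional (matrix) case. It is a hypothetical illustration, not the authors' implementation: it recovers the top-$L$ singular triplets of a matrix $A$ by plain gradient descent on the nested low-rank objective $J(U,V)=\sum_{l=1}^{L}\|A - U_{:,:l}V_{:,:l}^\top\|_F^2$, where each nested term pushes $U_{:,:l}V_{:,:l}^\top$ toward the best rank-$l$ approximation, so the columns align with the singular vectors in decreasing order without any explicit orthogonality constraint.

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a test operator with known singular values (3, 2, 1, 0.5, ...)
# so the recovered values are easy to check.
Q1, _ = np.linalg.qr(rng.standard_normal((8, 8)))
Q2, _ = np.linalg.qr(rng.standard_normal((6, 6)))
true_s = np.array([3.0, 2.0, 1.0, 0.5, 0.25, 0.1])
A = Q1[:, :6] @ np.diag(true_s) @ Q2.T

L, lr, steps = 3, 0.01, 8000
U = 0.1 * rng.standard_normal((8, L))  # left factors (one column per mode)
V = 0.1 * rng.standard_normal((6, L))  # right factors

for _ in range(steps):
    gU, gV = np.zeros_like(U), np.zeros_like(V)
    # Accumulate gradients of all L nested rank-l approximation losses.
    for l in range(1, L + 1):
        R = A - U[:, :l] @ V[:, :l].T      # residual of the rank-l term
        gU[:, :l] -= 2.0 * R @ V[:, :l]
        gV[:, :l] -= 2.0 * R.T @ U[:, :l]
    U -= lr * gU
    V -= lr * gV

# At the joint optimum u_l v_l^T = sigma_l * (left sv)(right sv)^T, so the
# product of column norms estimates sigma_l, in the correct order.
est = [np.linalg.norm(U[:, l]) * np.linalg.norm(V[:, l]) for l in range(L)]
print(np.round(est, 2))  # should approach the top-3 singular values 3, 2, 1
```

In an operator setting the columns of `U` and `V` become parameterized functions (e.g., neural networks) and the Frobenius loss becomes an expectation over data, but the nesting structure that orders the modes is the same.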

Cite this Paper
BibTeX
@InProceedings{pmlr-v235-ryu24b,
  title     = {Operator {SVD} with Neural Networks via Nested Low-Rank Approximation},
  author    = {Ryu, Jongha Jon and Xu, Xiangxiang and Erol, Hasan Sabri Melihcan and Bu, Yuheng and Zheng, Lizhong and Wornell, Gregory W.},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {42870--42905},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/ryu24b/ryu24b.pdf},
  url       = {https://proceedings.mlr.press/v235/ryu24b.html},
  abstract  = {Computing eigenvalue decomposition (EVD) of a given linear operator, or finding its leading eigenvalues and eigenfunctions, is a fundamental task in many machine learning and scientific simulation problems. For high-dimensional eigenvalue problems, training neural networks to parameterize the eigenfunctions is considered as a promising alternative to the classical numerical linear algebra techniques. This paper proposes a new optimization framework based on the low-rank approximation characterization of a truncated singular value decomposition, accompanied by new techniques called nesting for learning the top-$L$ singular values and singular functions in the correct order. The proposed method promotes the desired orthogonality in the learned functions implicitly and efficiently via an unconstrained optimization formulation, which is easy to solve with off-the-shelf gradient-based optimization algorithms. We demonstrate the effectiveness of the proposed optimization framework for use cases in computational physics and machine learning.}
}
Endnote
%0 Conference Paper
%T Operator SVD with Neural Networks via Nested Low-Rank Approximation
%A Jongha Jon Ryu
%A Xiangxiang Xu
%A Hasan Sabri Melihcan Erol
%A Yuheng Bu
%A Lizhong Zheng
%A Gregory W. Wornell
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-ryu24b
%I PMLR
%P 42870--42905
%U https://proceedings.mlr.press/v235/ryu24b.html
%V 235
%X Computing eigenvalue decomposition (EVD) of a given linear operator, or finding its leading eigenvalues and eigenfunctions, is a fundamental task in many machine learning and scientific simulation problems. For high-dimensional eigenvalue problems, training neural networks to parameterize the eigenfunctions is considered as a promising alternative to the classical numerical linear algebra techniques. This paper proposes a new optimization framework based on the low-rank approximation characterization of a truncated singular value decomposition, accompanied by new techniques called nesting for learning the top-$L$ singular values and singular functions in the correct order. The proposed method promotes the desired orthogonality in the learned functions implicitly and efficiently via an unconstrained optimization formulation, which is easy to solve with off-the-shelf gradient-based optimization algorithms. We demonstrate the effectiveness of the proposed optimization framework for use cases in computational physics and machine learning.
APA
Ryu, J.J., Xu, X., Erol, H.S.M., Bu, Y., Zheng, L. & Wornell, G.W. (2024). Operator SVD with Neural Networks via Nested Low-Rank Approximation. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:42870-42905. Available from https://proceedings.mlr.press/v235/ryu24b.html.

Related Material