Rethinking Neural-based Matrix Inversion: Why can’t, and Where can

Yuliang Ji, Jian Wu, Yuanzhe Xi
Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, PMLR 258:3583-3591, 2025.

Abstract

Deep neural networks have achieved substantial success across various scientific computing tasks. A pivotal challenge within this domain is the rapid and parallel approximation of matrix inverses, critical for numerous applications. Despite significant progress, there currently exists no universal neural-based method for approximating matrix inversion. This paper presents a theoretical analysis demonstrating the fundamental limitations of neural networks in developing a generalized matrix inversion model. We expand the class of Lipschitz functions to encompass a wider array of neural network models, thereby refining our theoretical approach. Moreover, we delineate specific conditions under which neural networks can effectively approximate matrix inverses. Our theoretical results are supported by experimental results from diverse matrix datasets, exploring the efficacy of neural networks in addressing the matrix inversion challenge.
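As a hedged illustration (not taken from the paper itself), the intuition behind the impossibility result can be seen numerically: the map A → A⁻¹ has an unbounded local Lipschitz ratio near singular matrices, so no single Lipschitz model can approximate inversion uniformly over all invertible inputs. The matrices below are hypothetical examples chosen for this sketch.

```python
import numpy as np

def inversion_sensitivity(eps):
    """Local Lipschitz ratio of A -> inv(A) for two nearby near-singular matrices.

    A and B differ by eps in Frobenius norm, yet their inverses differ by
    1/(2*eps), so the ratio grows like 1/(2*eps**2) as eps -> 0.
    """
    A = np.array([[eps, 0.0], [0.0, 1.0]])
    B = np.array([[2 * eps, 0.0], [0.0, 1.0]])      # perturbation of size eps
    in_dist = np.linalg.norm(A - B)                 # = eps
    out_dist = np.linalg.norm(np.linalg.inv(A) - np.linalg.inv(B))  # = 1/(2*eps)
    return out_dist / in_dist                       # = 1/(2*eps**2)

# The ratio blows up as the matrices approach singularity:
for eps in (1e-1, 1e-2, 1e-3):
    print(f"eps={eps:g}: Lipschitz ratio = {inversion_sensitivity(eps):.3e}")
```

This unbounded sensitivity is consistent with the paper's argument that a generalized neural inversion model cannot exist over arbitrary invertible matrices, while leaving room for success on restricted, well-conditioned classes.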

Cite this Paper


BibTeX
@InProceedings{pmlr-v258-ji25a,
  title     = {Rethinking Neural-based Matrix Inversion: Why can’t, and Where can},
  author    = {Ji, Yuliang and Wu, Jian and Xi, Yuanzhe},
  booktitle = {Proceedings of The 28th International Conference on Artificial Intelligence and Statistics},
  pages     = {3583--3591},
  year      = {2025},
  editor    = {Li, Yingzhen and Mandt, Stephan and Agrawal, Shipra and Khan, Emtiyaz},
  volume    = {258},
  series    = {Proceedings of Machine Learning Research},
  month     = {03--05 May},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v258/main/assets/ji25a/ji25a.pdf},
  url       = {https://proceedings.mlr.press/v258/ji25a.html},
  abstract  = {Deep neural networks have achieved substantial success across various scientific computing tasks. A pivotal challenge within this domain is the rapid and parallel approximation of matrix inverses, critical for numerous applications. Despite significant progress, there currently exists no universal neural-based method for approximating matrix inversion. This paper presents a theoretical analysis demonstrating the fundamental limitations of neural networks in developing a generalized matrix inversion model. We expand the class of Lipschitz functions to encompass a wider array of neural network models, thereby refining our theoretical approach. Moreover, we delineate specific conditions under which neural networks can effectively approximate matrix inverses. Our theoretical results are supported by experimental results from diverse matrix datasets, exploring the efficacy of neural networks in addressing the matrix inversion challenge.}
}
Endnote
%0 Conference Paper
%T Rethinking Neural-based Matrix Inversion: Why can’t, and Where can
%A Yuliang Ji
%A Jian Wu
%A Yuanzhe Xi
%B Proceedings of The 28th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2025
%E Yingzhen Li
%E Stephan Mandt
%E Shipra Agrawal
%E Emtiyaz Khan
%F pmlr-v258-ji25a
%I PMLR
%P 3583--3591
%U https://proceedings.mlr.press/v258/ji25a.html
%V 258
%X Deep neural networks have achieved substantial success across various scientific computing tasks. A pivotal challenge within this domain is the rapid and parallel approximation of matrix inverses, critical for numerous applications. Despite significant progress, there currently exists no universal neural-based method for approximating matrix inversion. This paper presents a theoretical analysis demonstrating the fundamental limitations of neural networks in developing a generalized matrix inversion model. We expand the class of Lipschitz functions to encompass a wider array of neural network models, thereby refining our theoretical approach. Moreover, we delineate specific conditions under which neural networks can effectively approximate matrix inverses. Our theoretical results are supported by experimental results from diverse matrix datasets, exploring the efficacy of neural networks in addressing the matrix inversion challenge.
APA
Ji, Y., Wu, J. & Xi, Y. (2025). Rethinking Neural-based Matrix Inversion: Why can’t, and Where can. Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 258:3583-3591. Available from https://proceedings.mlr.press/v258/ji25a.html.