The Galerkin method beats Graph-Based Approaches for Spectral Algorithms

Vivien A Cabannnes, Francis Bach
Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, PMLR 238:451-459, 2024.

Abstract

Historically, the machine learning community has derived spectral decompositions from graph-based approaches. We break with this tradition and prove the statistical and computational superiority of the Galerkin method, which consists of restricting the study to a small set of test functions. In particular, we introduce implementation tricks to handle differential operators in large dimensions with structured kernels. Finally, we extend the core principles of our approach to non-linear spaces of functions, such as those parameterized by deep neural networks, through loss-based optimization procedures.
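
To make the idea concrete, here is a minimal sketch (not the paper's estimator) of a Galerkin-style spectral algorithm: the eigenvalue problem for a weighted Laplacian under the data distribution is restricted to the span of a small set of test functions, here hypothetical Gaussian bumps phi_j centred at subsampled anchor points, and reduces to a small generalized eigenvalue problem. The bandwidth s, the number of test functions p, and the ridge term are illustrative assumptions, not choices made in the paper.

# Minimal sketch of a Galerkin spectral method, under the assumptions stated above.
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
n, d, p, s = 2000, 2, 50, 0.5                 # samples, dimension, test functions, bandwidth
X = rng.normal(size=(n, d))                   # samples from the (unknown) data distribution
Z = X[rng.choice(n, size=p, replace=False)]   # anchor points for the Gaussian test functions

diff = X[:, None, :] - Z[None, :, :]                        # (n, p, d) pairwise differences
phi = np.exp(-np.sum(diff ** 2, axis=-1) / (2 * s ** 2))    # phi_j(x_i), shape (n, p)
grad = -(diff / s ** 2) * phi[:, :, None]                   # analytic gradient of phi_j at x_i

# Galerkin matrices restricted to span{phi_1, ..., phi_p}:
#   A_ij ~ E[<grad phi_i, grad phi_j>]  (Dirichlet energy of the weighted Laplacian)
#   B_ij ~ E[phi_i phi_j]               (mass matrix)
A = np.einsum('npd,nqd->pq', grad, grad) / n
B = phi.T @ phi / n

# Generalized eigenvalue problem A c = lambda B c; a small ridge keeps B positive definite.
eigvals, C = eigh(A, B + 1e-6 * np.eye(p))
f = phi @ C[:, :5]          # values of the first eigenfunctions at the data points
print(np.round(eigvals[:5], 3))   # smallest eigenvalue ~ 0 (constant eigenfunction)

A graph-based alternative would build an n x n graph Laplacian over all samples, whereas the Galerkin reduction above only manipulates p x p matrices, which is one way to read the computational advantage claimed in the abstract.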

Cite this Paper


BibTeX
@InProceedings{pmlr-v238-a-cabannnes24a,
  title     = {The {G}alerkin method beats Graph-Based Approaches for Spectral Algorithms},
  author    = {A Cabannnes, Vivien and Bach, Francis},
  booktitle = {Proceedings of The 27th International Conference on Artificial Intelligence and Statistics},
  pages     = {451--459},
  year      = {2024},
  editor    = {Dasgupta, Sanjoy and Mandt, Stephan and Li, Yingzhen},
  volume    = {238},
  series    = {Proceedings of Machine Learning Research},
  month     = {02--04 May},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v238/a-cabannnes24a/a-cabannnes24a.pdf},
  url       = {https://proceedings.mlr.press/v238/a-cabannnes24a.html},
  abstract  = {Historically, the machine learning community has derived spectral decompositions from graph-based approaches. We break with this tradition and prove the statistical and computational superiority of the Galerkin method, which consists of restricting the study to a small set of test functions. In particular, we introduce implementation tricks to handle differential operators in large dimensions with structured kernels. Finally, we extend the core principles of our approach to non-linear spaces of functions, such as those parameterized by deep neural networks, through loss-based optimization procedures.}
}
Endnote
%0 Conference Paper
%T The Galerkin method beats Graph-Based Approaches for Spectral Algorithms
%A Vivien A Cabannnes
%A Francis Bach
%B Proceedings of The 27th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2024
%E Sanjoy Dasgupta
%E Stephan Mandt
%E Yingzhen Li
%F pmlr-v238-a-cabannnes24a
%I PMLR
%P 451--459
%U https://proceedings.mlr.press/v238/a-cabannnes24a.html
%V 238
%X Historically, the machine learning community has derived spectral decompositions from graph-based approaches. We break with this tradition and prove the statistical and computational superiority of the Galerkin method, which consists of restricting the study to a small set of test functions. In particular, we introduce implementation tricks to handle differential operators in large dimensions with structured kernels. Finally, we extend the core principles of our approach to non-linear spaces of functions, such as those parameterized by deep neural networks, through loss-based optimization procedures.
APA
A Cabannnes, V. & Bach, F. (2024). The Galerkin method beats Graph-Based Approaches for Spectral Algorithms. Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 238:451-459. Available from https://proceedings.mlr.press/v238/a-cabannnes24a.html.