An Exact Kernel Equivalence for Finite Classification Models

Brian Wesley Bell, Michael Geyer, David Glickenstein, Amanda S Fernandez, Juston Moore
Proceedings of 2nd Annual Workshop on Topology, Algebra, and Geometry in Machine Learning (TAG-ML), PMLR 221:206-217, 2023.

Abstract

We explore the equivalence between neural networks and kernel methods by deriving the first exact representation of any finite-size parametric classification model trained with gradient descent as a kernel machine. We compare our exact representation to the well-known Neural Tangent Kernel (NTK) and discuss approximation error relative to the NTK and other non-exact path kernel formulations. We experimentally demonstrate that the kernel can be computed for realistic networks up to machine precision. We use this exact kernel to show that our theoretical contribution can provide useful insights into the predictions made by neural networks, particularly the way in which they generalize.
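The Neural Tangent Kernel mentioned in the abstract is, for a parametric model f(x; θ), the inner product of the parameter gradients at two inputs: k(x, x′) = ⟨∇θf(x), ∇θf(x′)⟩. As an illustrative sketch only (not code from the paper), the following computes the empirical NTK of a small two-layer tanh network in NumPy; the architecture, dimensions, and function names here are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
d, h = 3, 5                  # input dimension, hidden width (illustrative choices)
W = rng.normal(size=(h, d))  # hidden-layer weights
v = rng.normal(size=h)       # output-layer weights

def grad_theta(x):
    """Flattened gradient of f(x) = v @ tanh(W @ x) with respect to (W, v)."""
    a = np.tanh(W @ x)
    dv = a                                  # df/dv
    dW = np.outer(v * (1.0 - a**2), x)      # df/dW via the chain rule
    return np.concatenate([dW.ravel(), dv])

def empirical_ntk(x1, x2):
    """k(x1, x2) = <grad_theta f(x1), grad_theta f(x2)> at the current parameters."""
    return grad_theta(x1) @ grad_theta(x2)

x1, x2 = rng.normal(size=d), rng.normal(size=d)
K = np.array([[empirical_ntk(a, b) for b in (x1, x2)] for a in (x1, x2)])
# K is a symmetric positive semi-definite Gram matrix by construction
```

Note that this empirical NTK is evaluated at a single fixed parameter vector; the paper's exact representation instead accounts for the full gradient-descent training path, which is what distinguishes it from NTK-style approximations.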

Cite this Paper


BibTeX
@InProceedings{pmlr-v221-bell23a,
  title = {An Exact Kernel Equivalence for Finite Classification Models},
  author = {Bell, Brian Wesley and Geyer, Michael and Glickenstein, David and Fernandez, Amanda S and Moore, Juston},
  booktitle = {Proceedings of 2nd Annual Workshop on Topology, Algebra, and Geometry in Machine Learning (TAG-ML)},
  pages = {206--217},
  year = {2023},
  editor = {Doster, Timothy and Emerson, Tegan and Kvinge, Henry and Miolane, Nina and Papillon, Mathilde and Rieck, Bastian and Sanborn, Sophia},
  volume = {221},
  series = {Proceedings of Machine Learning Research},
  month = {28 Jul},
  publisher = {PMLR},
  pdf = {https://proceedings.mlr.press/v221/bell23a/bell23a.pdf},
  url = {https://proceedings.mlr.press/v221/bell23a.html},
  abstract = {We explore the equivalence between neural networks and kernel methods by deriving the first exact representation of any finite-size parametric classification model trained with gradient descent as a kernel machine. We compare our exact representation to the well-known Neural Tangent Kernel (NTK) and discuss approximation error relative to the NTK and other non-exact path kernel formulations. We experimentally demonstrate that the kernel can be computed for realistic networks up to machine precision. We use this exact kernel to show that our theoretical contribution can provide useful insights into the predictions made by neural networks, particularly the way in which they generalize.}
}
Endnote
%0 Conference Paper
%T An Exact Kernel Equivalence for Finite Classification Models
%A Brian Wesley Bell
%A Michael Geyer
%A David Glickenstein
%A Amanda S Fernandez
%A Juston Moore
%B Proceedings of 2nd Annual Workshop on Topology, Algebra, and Geometry in Machine Learning (TAG-ML)
%C Proceedings of Machine Learning Research
%D 2023
%E Timothy Doster
%E Tegan Emerson
%E Henry Kvinge
%E Nina Miolane
%E Mathilde Papillon
%E Bastian Rieck
%E Sophia Sanborn
%F pmlr-v221-bell23a
%I PMLR
%P 206--217
%U https://proceedings.mlr.press/v221/bell23a.html
%V 221
%X We explore the equivalence between neural networks and kernel methods by deriving the first exact representation of any finite-size parametric classification model trained with gradient descent as a kernel machine. We compare our exact representation to the well-known Neural Tangent Kernel (NTK) and discuss approximation error relative to the NTK and other non-exact path kernel formulations. We experimentally demonstrate that the kernel can be computed for realistic networks up to machine precision. We use this exact kernel to show that our theoretical contribution can provide useful insights into the predictions made by neural networks, particularly the way in which they generalize.
APA
Bell, B.W., Geyer, M., Glickenstein, D., Fernandez, A.S. & Moore, J. (2023). An Exact Kernel Equivalence for Finite Classification Models. Proceedings of 2nd Annual Workshop on Topology, Algebra, and Geometry in Machine Learning (TAG-ML), in Proceedings of Machine Learning Research 221:206-217. Available from https://proceedings.mlr.press/v221/bell23a.html.