Reverse Engineering the Neural Tangent Kernel

James Benjamin Simon, Sajant Anand, Mike Deweese
Proceedings of the 39th International Conference on Machine Learning, PMLR 162:20215-20231, 2022.

Abstract

The development of methods to guide the design of neural networks is an important open challenge for deep learning theory. As a paradigm for principled neural architecture design, we propose the translation of high-performing kernels, which are better-understood and amenable to first-principles design, into equivalent network architectures, which have superior efficiency, flexibility, and feature learning. To this end, we constructively prove that, with just an appropriate choice of activation function, any positive-semidefinite dot-product kernel can be realized as either the NNGP or neural tangent kernel of a fully-connected neural network with only one hidden layer. We verify our construction numerically and demonstrate its utility as a design tool for finite fully-connected networks in several experiments.
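To give a concrete sense of the result stated in the abstract, the sketch below illustrates one way such a construction can work (normalization and the NTK case are treated in the paper itself; the target coefficients, the empirical_nngp helper, and the restriction to the NNGP kernel here are illustrative assumptions, not the paper's exact procedure). Given a positive-semidefinite dot-product kernel K(c) = sum_i b_i c^i with b_i >= 0, one can choose an activation whose normalized Hermite-series coefficients are sqrt(b_i); the infinite-width NNGP kernel of a one-hidden-layer network with that activation then reproduces K, which the Monte Carlo check below verifies approximately.

    import numpy as np
    from numpy.polynomial.hermite_e import hermeval
    from math import factorial

    # Target dot-product kernel K(c) = sum_i b_i c^i with b_i >= 0
    # (illustrative coefficients; any nonnegative sequence works).
    b = np.array([0.2, 0.5, 0.2, 0.1])

    # Hermite coefficients of the activation: a_i = sqrt(b_i) realizes K as
    # the NNGP kernel of a one-hidden-layer network (simplified version of
    # the paper's construction; bias and normalization details omitted).
    a = np.sqrt(b)

    def activation(x):
        # phi(x) = sum_i a_i He_i(x) / sqrt(i!), with He_i the probabilists'
        # Hermite polynomials (numpy's "hermite_e" basis), so that the h_i
        # are orthonormal under the standard Gaussian measure.
        coeffs = a / np.sqrt([factorial(i) for i in range(len(a))])
        return hermeval(x, coeffs)

    def empirical_nngp(c, n_samples=200000, seed=0):
        # Monte Carlo estimate of E[phi(u) phi(v)] for (u, v) jointly Gaussian
        # with unit variances and correlation c -- the NNGP kernel value of a
        # wide one-hidden-layer network on unit-norm inputs with x . x' = c.
        rng = np.random.default_rng(seed)
        u = rng.standard_normal(n_samples)
        v = c * u + np.sqrt(1 - c**2) * rng.standard_normal(n_samples)
        return np.mean(activation(u) * activation(v))

    for c in [-0.5, 0.0, 0.3, 0.8]:
        target = np.polyval(b[::-1], c)        # K(c) = sum_i b_i c^i
        print(c, target, empirical_nngp(c))    # the two should roughly agree

The same idea extends to the NTK case by rescaling the coefficients to account for the extra derivative term in the tangent kernel; the paper gives the exact choice.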

Cite this Paper


BibTeX
@InProceedings{pmlr-v162-simon22a,
  title     = {Reverse Engineering the Neural Tangent Kernel},
  author    = {Simon, James Benjamin and Anand, Sajant and Deweese, Mike},
  booktitle = {Proceedings of the 39th International Conference on Machine Learning},
  pages     = {20215--20231},
  year      = {2022},
  editor    = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
  volume    = {162},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--23 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v162/simon22a/simon22a.pdf},
  url       = {https://proceedings.mlr.press/v162/simon22a.html},
  abstract  = {The development of methods to guide the design of neural networks is an important open challenge for deep learning theory. As a paradigm for principled neural architecture design, we propose the translation of high-performing kernels, which are better-understood and amenable to first-principles design, into equivalent network architectures, which have superior efficiency, flexibility, and feature learning. To this end, we constructively prove that, with just an appropriate choice of activation function, any positive-semidefinite dot-product kernel can be realized as either the NNGP or neural tangent kernel of a fully-connected neural network with only one hidden layer. We verify our construction numerically and demonstrate its utility as a design tool for finite fully-connected networks in several experiments.}
}
Endnote
%0 Conference Paper
%T Reverse Engineering the Neural Tangent Kernel
%A James Benjamin Simon
%A Sajant Anand
%A Mike Deweese
%B Proceedings of the 39th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Kamalika Chaudhuri
%E Stefanie Jegelka
%E Le Song
%E Csaba Szepesvari
%E Gang Niu
%E Sivan Sabato
%F pmlr-v162-simon22a
%I PMLR
%P 20215--20231
%U https://proceedings.mlr.press/v162/simon22a.html
%V 162
%X The development of methods to guide the design of neural networks is an important open challenge for deep learning theory. As a paradigm for principled neural architecture design, we propose the translation of high-performing kernels, which are better-understood and amenable to first-principles design, into equivalent network architectures, which have superior efficiency, flexibility, and feature learning. To this end, we constructively prove that, with just an appropriate choice of activation function, any positive-semidefinite dot-product kernel can be realized as either the NNGP or neural tangent kernel of a fully-connected neural network with only one hidden layer. We verify our construction numerically and demonstrate its utility as a design tool for finite fully-connected networks in several experiments.
APA
Simon, J.B., Anand, S. & Deweese, M. (2022). Reverse Engineering the Neural Tangent Kernel. Proceedings of the 39th International Conference on Machine Learning, in Proceedings of Machine Learning Research 162:20215-20231. Available from https://proceedings.mlr.press/v162/simon22a.html.