Information in Infinite Ensembles of Infinitely-Wide Neural Networks

Ravid Shwartz-Ziv, Alexander A Alemi
Proceedings of The 2nd Symposium on Advances in Approximate Bayesian Inference, PMLR 118:1-17, 2020.

Abstract

In this preliminary work, we study the generalization properties of infinite ensembles of infinitely-wide neural networks. Amazingly, this model family admits tractable calculations for many information-theoretic quantities. We report analytical and empirical investigations in the search for signals that correlate with generalization.
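
The tractability claim rests on the fact that, in the infinite-width limit, the ensemble's outputs are jointly Gaussian, so information-theoretic quantities reduce to closed-form expressions in log-determinants of covariance matrices. The sketch below is illustrative only and not taken from the paper: it assumes two jointly Gaussian variables with a known joint covariance (e.g., blocks of an NNGP kernel matrix) and computes their mutual information.

import numpy as np

def gaussian_mutual_information(cov_joint, dim_x):
    """Mutual information I(X; Y) in nats for jointly Gaussian (X, Y).

    cov_joint : (dx + dy, dx + dy) joint covariance matrix, X occupying
                the first dim_x rows/columns.
    Uses I(X; Y) = 0.5 * (log det Sigma_X + log det Sigma_Y - log det Sigma_XY).
    """
    cov_x = cov_joint[:dim_x, :dim_x]
    cov_y = cov_joint[dim_x:, dim_x:]
    _, logdet_x = np.linalg.slogdet(cov_x)
    _, logdet_y = np.linalg.slogdet(cov_y)
    _, logdet_xy = np.linalg.slogdet(cov_joint)
    return 0.5 * (logdet_x + logdet_y - logdet_xy)

# Toy check: bivariate Gaussian with correlation 0.8,
# where I(X; Y) = -0.5 * log(1 - 0.8**2) ≈ 0.511 nats.
cov = np.array([[1.0, 0.8],
                [0.8, 1.0]])
print(gaussian_mutual_information(cov, dim_x=1))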

Cite this Paper


BibTeX
@InProceedings{pmlr-v118-shwartz-ziv20a,
  title     = {Information in Infinite Ensembles of Infinitely-Wide Neural Networks},
  author    = {Shwartz-Ziv, Ravid and Alemi, Alexander A},
  booktitle = {Proceedings of The 2nd Symposium on Advances in Approximate Bayesian Inference},
  pages     = {1--17},
  year      = {2020},
  editor    = {Zhang, Cheng and Ruiz, Francisco and Bui, Thang and Dieng, Adji Bousso and Liang, Dawen},
  volume    = {118},
  series    = {Proceedings of Machine Learning Research},
  month     = {08 Dec},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v118/shwartz-ziv20a/shwartz-ziv20a.pdf},
  url       = {https://proceedings.mlr.press/v118/shwartz-ziv20a.html},
  abstract  = {In this preliminary work, we study the generalization properties of infinite ensembles of infinitely-wide neural networks. Amazingly, this model family admits tractable calculations for many information-theoretic quantities. We report analytical and empirical investigations in the search for signals that correlate with generalization.}
}
Endnote
%0 Conference Paper %T Information in Infinite Ensembles of Infinitely-Wide Neural Networks %A Ravid Shwartz-Ziv %A Alexander A Alemi %B Proceedings of The 2nd Symposium on Advances in Approximate Bayesian Inference %C Proceedings of Machine Learning Research %D 2020 %E Cheng Zhang %E Francisco Ruiz %E Thang Bui %E Adji Bousso Dieng %E Dawen Liang %F pmlr-v118-shwartz-ziv20a %I PMLR %P 1--17 %U https://proceedings.mlr.press/v118/shwartz-ziv20a.html %V 118 %X In this preliminary work, we study the generalization properties of infinite ensembles of infinitely-wide neural networks. Amazingly, this model family admits tractable calculations for many information-theoretic quantities. We report analytical and empirical investigations in the search for signals that correlate with generalization.
APA
Shwartz-Ziv, R. & Alemi, A. A. (2020). Information in Infinite Ensembles of Infinitely-Wide Neural Networks. Proceedings of The 2nd Symposium on Advances in Approximate Bayesian Inference, in Proceedings of Machine Learning Research 118:1-17. Available from https://proceedings.mlr.press/v118/shwartz-ziv20a.html.