Size-Independent Sample Complexity of Neural Networks

Noah Golowich, Alexander Rakhlin, Ohad Shamir
Proceedings of the 31st Conference On Learning Theory, PMLR 75:297-299, 2018.

Abstract

We study the sample complexity of learning neural networks, by providing new bounds on their Rademacher complexity assuming norm constraints on the parameter matrix of each layer. Compared to previous work, these complexity bounds have improved dependence on the network depth, and under some additional assumptions, are fully independent of the network size (both depth and width). These results are derived using some novel techniques, which may be of independent interest.
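For context, the flavor of the paper's first main result can be sketched as follows (a hedged paraphrase of Theorem 1 in the full arXiv version, not a verbatim statement): for depth-$d$ networks with 1-Lipschitz, positive-homogeneous activations (e.g. ReLU), inputs of Euclidean norm at most $B$, and an assumed Frobenius-norm bound $M_F(j)$ on the weight matrix of layer $j$, the Rademacher complexity over $m$ samples satisfies

\[
\mathcal{R}_m(\mathcal{H}_d) \;\le\; \frac{B\bigl(\sqrt{2\log(2)\,d} + 1\bigr)\,\prod_{j=1}^{d} M_F(j)}{\sqrt{m}},
\]

i.e. the explicit depth dependence is only $\sqrt{d}$, in contrast to the exponential-in-$d$ factors of earlier norm-based analyses; the fully size-independent bounds mentioned in the abstract remove this $\sqrt{d}$ under additional norm assumptions.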

Cite this Paper


BibTeX
@InProceedings{pmlr-v75-golowich18a,
  title     = {Size-Independent Sample Complexity of Neural Networks},
  author    = {Golowich, Noah and Rakhlin, Alexander and Shamir, Ohad},
  booktitle = {Proceedings of the 31st Conference On Learning Theory},
  pages     = {297--299},
  year      = {2018},
  editor    = {Bubeck, Sébastien and Perchet, Vianney and Rigollet, Philippe},
  volume    = {75},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--09 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v75/golowich18a/golowich18a.pdf},
  url       = {https://proceedings.mlr.press/v75/golowich18a.html},
  abstract  = {We study the sample complexity of learning neural networks, by providing new bounds on their Rademacher complexity assuming norm constraints on the parameter matrix of each layer. Compared to previous work, these complexity bounds have improved dependence on the network depth, and under some additional assumptions, are fully independent of the network size (both depth and width). These results are derived using some novel techniques, which may be of independent interest.}
}
Endnote
%0 Conference Paper
%T Size-Independent Sample Complexity of Neural Networks
%A Noah Golowich
%A Alexander Rakhlin
%A Ohad Shamir
%B Proceedings of the 31st Conference On Learning Theory
%C Proceedings of Machine Learning Research
%D 2018
%E Sébastien Bubeck
%E Vianney Perchet
%E Philippe Rigollet
%F pmlr-v75-golowich18a
%I PMLR
%P 297--299
%U https://proceedings.mlr.press/v75/golowich18a.html
%V 75
%X We study the sample complexity of learning neural networks, by providing new bounds on their Rademacher complexity assuming norm constraints on the parameter matrix of each layer. Compared to previous work, these complexity bounds have improved dependence on the network depth, and under some additional assumptions, are fully independent of the network size (both depth and width). These results are derived using some novel techniques, which may be of independent interest.
APA
Golowich, N., Rakhlin, A. & Shamir, O. (2018). Size-Independent Sample Complexity of Neural Networks. Proceedings of the 31st Conference On Learning Theory, in Proceedings of Machine Learning Research 75:297-299. Available from https://proceedings.mlr.press/v75/golowich18a.html.
