Benefits of Overparameterized Convolutional Residual Networks: Function Approximation under Smoothness Constraint

Hao Liu, Minshuo Chen, Siawpeng Er, Wenjing Liao, Tong Zhang, Tuo Zhao
Proceedings of the 39th International Conference on Machine Learning, PMLR 162:13669-13703, 2022.

Abstract

Overparameterized neural networks enjoy great representation power on complex data and, more importantly, yield sufficiently smooth outputs, which is crucial to their generalization and robustness. Most existing function approximation theories suggest that, with sufficiently many parameters, neural networks can approximate certain classes of functions well in terms of function value. The networks themselves, however, can be highly nonsmooth. To bridge this gap, we take convolutional residual networks (ConvResNets) as an example and prove that large ConvResNets can not only approximate a target function in terms of function value, but also exhibit sufficient first-order smoothness. Moreover, we extend our theory to approximating functions supported on a low-dimensional manifold. Our theory partially justifies the benefits of using deep and wide networks in practice. Numerical experiments on adversarially robust image classification are provided to support our theory.
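To make the setting concrete, the following is a minimal PyTorch sketch of a convolutional residual network of the kind the abstract refers to, together with an empirical probe of first-order smoothness via input-gradient norms. The channel width, depth, and the gradient-norm check are illustrative assumptions of this sketch, not the paper's construction.

    # Minimal ConvResNet sketch (illustrative only; not the paper's exact
    # construction). Residual blocks of convolutions are stacked, and
    # first-order smoothness is probed empirically via input-gradient norms.
    import torch
    import torch.nn as nn

    class ConvBlock(nn.Module):
        """One residual block: x + Conv-ReLU-Conv with an identity skip."""
        def __init__(self, channels: int):
            super().__init__()
            self.body = nn.Sequential(
                nn.Conv2d(channels, channels, kernel_size=3, padding=1),
                nn.ReLU(),
                nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            )

        def forward(self, x):
            return x + self.body(x)  # identity skip connection

    class ConvResNet(nn.Module):
        """Stem convolution, a stack of residual blocks, pooling, linear head."""
        def __init__(self, in_channels=3, channels=16, num_blocks=4, out_dim=1):
            super().__init__()
            self.stem = nn.Conv2d(in_channels, channels, kernel_size=3, padding=1)
            self.blocks = nn.Sequential(*[ConvBlock(channels) for _ in range(num_blocks)])
            self.head = nn.Linear(channels, out_dim)

        def forward(self, x):
            h = self.blocks(self.stem(x))
            h = h.mean(dim=(2, 3))  # global average pooling over spatial dims
            return self.head(h)

    # Empirical first-order smoothness check: the per-sample norm of the
    # output's gradient w.r.t. the input lower-bounds the local Lipschitz
    # constant of the network.
    model = ConvResNet()
    x = torch.randn(8, 3, 32, 32, requires_grad=True)
    y = model(x).sum()          # per-sample gradients, since samples are independent
    (grad,) = torch.autograd.grad(y, x)
    print("max input-gradient norm:", grad.flatten(1).norm(dim=1).max().item())

The printed input-gradient norm is a lower bound on the local Lipschitz constant, which is the first-order smoothness quantity the abstract's approximation guarantee concerns: a network can fit function values well while this quantity blows up, and the paper's point is that sufficiently large ConvResNets need not exhibit that blow-up.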

Cite this Paper
BibTeX
@InProceedings{pmlr-v162-liu22c,
  title     = {Benefits of Overparameterized Convolutional Residual Networks: Function Approximation under Smoothness Constraint},
  author    = {Liu, Hao and Chen, Minshuo and Er, Siawpeng and Liao, Wenjing and Zhang, Tong and Zhao, Tuo},
  booktitle = {Proceedings of the 39th International Conference on Machine Learning},
  pages     = {13669--13703},
  year      = {2022},
  editor    = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
  volume    = {162},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--23 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v162/liu22c/liu22c.pdf},
  url       = {https://proceedings.mlr.press/v162/liu22c.html},
  abstract  = {Overparameterized neural networks enjoy great representation power on complex data and, more importantly, yield sufficiently smooth outputs, which is crucial to their generalization and robustness. Most existing function approximation theories suggest that, with sufficiently many parameters, neural networks can approximate certain classes of functions well in terms of function value. The networks themselves, however, can be highly nonsmooth. To bridge this gap, we take convolutional residual networks (ConvResNets) as an example and prove that large ConvResNets can not only approximate a target function in terms of function value, but also exhibit sufficient first-order smoothness. Moreover, we extend our theory to approximating functions supported on a low-dimensional manifold. Our theory partially justifies the benefits of using deep and wide networks in practice. Numerical experiments on adversarially robust image classification are provided to support our theory.}
}
Endnote
%0 Conference Paper
%T Benefits of Overparameterized Convolutional Residual Networks: Function Approximation under Smoothness Constraint
%A Hao Liu
%A Minshuo Chen
%A Siawpeng Er
%A Wenjing Liao
%A Tong Zhang
%A Tuo Zhao
%B Proceedings of the 39th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Kamalika Chaudhuri
%E Stefanie Jegelka
%E Le Song
%E Csaba Szepesvari
%E Gang Niu
%E Sivan Sabato
%F pmlr-v162-liu22c
%I PMLR
%P 13669--13703
%U https://proceedings.mlr.press/v162/liu22c.html
%V 162
%X Overparameterized neural networks enjoy great representation power on complex data and, more importantly, yield sufficiently smooth outputs, which is crucial to their generalization and robustness. Most existing function approximation theories suggest that, with sufficiently many parameters, neural networks can approximate certain classes of functions well in terms of function value. The networks themselves, however, can be highly nonsmooth. To bridge this gap, we take convolutional residual networks (ConvResNets) as an example and prove that large ConvResNets can not only approximate a target function in terms of function value, but also exhibit sufficient first-order smoothness. Moreover, we extend our theory to approximating functions supported on a low-dimensional manifold. Our theory partially justifies the benefits of using deep and wide networks in practice. Numerical experiments on adversarially robust image classification are provided to support our theory.
APA
Liu, H., Chen, M., Er, S., Liao, W., Zhang, T. & Zhao, T. (2022). Benefits of Overparameterized Convolutional Residual Networks: Function Approximation under Smoothness Constraint. Proceedings of the 39th International Conference on Machine Learning, in Proceedings of Machine Learning Research 162:13669-13703. Available from https://proceedings.mlr.press/v162/liu22c.html.