Approximating Lipschitz continuous functions with GroupSort neural networks

Ugo Tanielian, Gerard Biau
Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, PMLR 130:442-450, 2021.

Abstract

Recent advances in adversarial attacks and Wasserstein GANs have advocated for the use of neural networks with restricted Lipschitz constants. Motivated by these observations, we study the recently introduced GroupSort neural networks, with constraints on the weights, and make a theoretical step towards a better understanding of their expressive power. We show in particular how these networks can represent any Lipschitz continuous piecewise linear function. We also prove that they are well-suited for approximating Lipschitz continuous functions and exhibit upper bounds on both the depth and size. To conclude, the efficiency of GroupSort networks compared with more standard ReLU networks is illustrated in a set of synthetic experiments.
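The GroupSort activation studied in the paper partitions a layer's pre-activations into groups of a fixed size and sorts each group; with groups of size 2 it reduces to the MaxMin activation. A minimal NumPy sketch (the function name `groupsort` and the use of contiguous groups are illustrative choices, not the authors' code):

```python
import numpy as np

def groupsort(x, group_size=2):
    """GroupSort activation: partition the last axis into contiguous
    groups of `group_size` entries and sort each group in ascending
    order. With group_size=2 this is the MaxMin activation.
    Sorting is a permutation, so the activation is 1-Lipschitz and
    gradient-norm preserving."""
    x = np.asarray(x, dtype=float)
    if x.shape[-1] % group_size != 0:
        raise ValueError("layer width must be divisible by group_size")
    grouped = x.reshape(x.shape[:-1] + (x.shape[-1] // group_size, group_size))
    return np.sort(grouped, axis=-1).reshape(x.shape)

# Each pair (3, 1) and (-2, 5) is sorted independently.
print(groupsort([3.0, 1.0, -2.0, 5.0]))  # -> [ 1.  3. -2.  5.]
```

Because sorting only permutes coordinates, stacking such layers with norm-constrained weight matrices keeps the whole network Lipschitz, which is the setting the paper's approximation bounds address.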

Cite this Paper


BibTeX
@InProceedings{pmlr-v130-tanielian21a,
  title     = {Approximating Lipschitz continuous functions with GroupSort neural networks},
  author    = {Tanielian, Ugo and Biau, Gerard},
  booktitle = {Proceedings of The 24th International Conference on Artificial Intelligence and Statistics},
  pages     = {442--450},
  year      = {2021},
  editor    = {Banerjee, Arindam and Fukumizu, Kenji},
  volume    = {130},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--15 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v130/tanielian21a/tanielian21a.pdf},
  url       = {https://proceedings.mlr.press/v130/tanielian21a.html},
  abstract  = {Recent advances in adversarial attacks and Wasserstein GANs have advocated for the use of neural networks with restricted Lipschitz constants. Motivated by these observations, we study the recently introduced GroupSort neural networks, with constraints on the weights, and make a theoretical step towards a better understanding of their expressive power. We show in particular how these networks can represent any Lipschitz continuous piecewise linear function. We also prove that they are well-suited for approximating Lipschitz continuous functions and exhibit upper bounds on both the depth and size. To conclude, the efficiency of GroupSort networks compared with more standard ReLU networks is illustrated in a set of synthetic experiments.}
}
Endnote
%0 Conference Paper
%T Approximating Lipschitz continuous functions with GroupSort neural networks
%A Ugo Tanielian
%A Gerard Biau
%B Proceedings of The 24th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2021
%E Arindam Banerjee
%E Kenji Fukumizu
%F pmlr-v130-tanielian21a
%I PMLR
%P 442--450
%U https://proceedings.mlr.press/v130/tanielian21a.html
%V 130
%X Recent advances in adversarial attacks and Wasserstein GANs have advocated for the use of neural networks with restricted Lipschitz constants. Motivated by these observations, we study the recently introduced GroupSort neural networks, with constraints on the weights, and make a theoretical step towards a better understanding of their expressive power. We show in particular how these networks can represent any Lipschitz continuous piecewise linear function. We also prove that they are well-suited for approximating Lipschitz continuous functions and exhibit upper bounds on both the depth and size. To conclude, the efficiency of GroupSort networks compared with more standard ReLU networks is illustrated in a set of synthetic experiments.
APA
Tanielian, U. &amp; Biau, G. (2021). Approximating Lipschitz continuous functions with GroupSort neural networks. Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 130:442-450. Available from https://proceedings.mlr.press/v130/tanielian21a.html.