On the number of linear functions composing deep neural network: Towards a refined definition of neural networks complexity

Yuuki Takai, Akiyoshi Sannai, Matthieu Cordonnier
Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, PMLR 130:3799-3807, 2021.

Abstract

The classical approach to measuring the expressive power of deep neural networks with piecewise linear activations is to count their maximum number of linear regions. This complexity measure is well suited to understanding general properties of the expressivity of neural networks, such as the benefit of depth over width. Nevertheless, it proves limited when it comes to comparing the expressivity of different network architectures. This limitation becomes particularly prominent for permutation-invariant networks, owing to the symmetric redundancy among their linear regions. To tackle this, we propose a refined definition of piecewise linear function complexity: instead of counting the number of linear regions directly, we first introduce an equivalence relation among the linear functions composing a piecewise linear function and then count those linear functions relative to that equivalence relation. Our new complexity measure clearly distinguishes between standard networks and their permutation-invariant counterparts, is consistent with the classical measure, and increases exponentially with depth.
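The following Python sketch is a toy numerical illustration of the idea, not the paper's construction: the shared-weight (DeepSets-style) layer, the specific equivalence relation used here (equality of affine maps up to a permutation of the input coordinates), and the round-and-hash deduplication are all our own illustrative choices. The sketch samples inputs to enumerate the linear regions of a small permutation-invariant ReLU network, recovers the affine function realized on each region, and counts those functions both exactly and up to input permutation.

import itertools
import numpy as np

rng = np.random.default_rng(0)

# A toy permutation-invariant (DeepSets-style) ReLU layer on n inputs:
# hidden unit i computes relu(lam * x_i + gam * sum(x) + b), and the
# read-out sums the hidden units, so permuting the inputs permutes the
# hidden activations without changing the output.
n = 3
lam, gam, b = 1.0, -0.5, 0.25
W1 = lam * np.eye(n) + gam * np.ones((n, n))  # shared first-layer weights
b1 = b * np.ones(n)
w2 = np.ones(n)                               # invariant read-out

# Sample inputs and record each one's activation pattern; each distinct
# pattern corresponds to (at most) one linear region.
X = rng.uniform(-5, 5, size=(300_000, n))
patterns = (X @ W1.T + b1 > 0)
regions = np.unique(patterns, axis=0)

def affine(d):
    # On the region with activation pattern d, the network restricts to
    # the affine map x -> A x + c with A = w2 diag(d) W1, c = w2 diag(d) b1.
    A = w2 @ (d[:, None] * W1)
    c = w2 @ (d * b1)
    return A, float(c)

raw, up_to_perm = set(), set()
for d in regions:
    A, c = affine(d)
    raw.add((np.round(A, 8).tobytes(), round(c, 8)))
    # Canonical representative of the permutation orbit: the
    # lexicographically smallest reordering of A's entries.
    canon = min(tuple(np.round(A[list(p)], 8))
                for p in itertools.permutations(range(n)))
    up_to_perm.add((canon, round(c, 8)))

print("linear regions sampled:          ", len(regions))
print("distinct linear functions:       ", len(raw))
print("distinct up to input permutation:", len(up_to_perm))

With these weights the sampler typically finds 8 regions and 8 distinct affine pieces, which collapse to 4 classes once functions that agree up to an input permutation are identified; this is precisely the symmetric redundancy that inflates the classical region count for invariant networks.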

Cite this Paper


BibTeX
@InProceedings{pmlr-v130-takai21a,
  title     = {On the number of linear functions composing deep neural network: Towards a refined definition of neural networks complexity},
  author    = {Takai, Yuuki and Sannai, Akiyoshi and Cordonnier, Matthieu},
  booktitle = {Proceedings of The 24th International Conference on Artificial Intelligence and Statistics},
  pages     = {3799--3807},
  year      = {2021},
  editor    = {Banerjee, Arindam and Fukumizu, Kenji},
  volume    = {130},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--15 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v130/takai21a/takai21a.pdf},
  url       = {https://proceedings.mlr.press/v130/takai21a.html}
}
Endnote
%0 Conference Paper
%T On the number of linear functions composing deep neural network: Towards a refined definition of neural networks complexity
%A Yuuki Takai
%A Akiyoshi Sannai
%A Matthieu Cordonnier
%B Proceedings of The 24th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2021
%E Arindam Banerjee
%E Kenji Fukumizu
%F pmlr-v130-takai21a
%I PMLR
%P 3799--3807
%U https://proceedings.mlr.press/v130/takai21a.html
%V 130
APA
Takai, Y., Sannai, A. & Cordonnier, M. (2021). On the number of linear functions composing deep neural network: Towards a refined definition of neural networks complexity. Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 130:3799-3807. Available from https://proceedings.mlr.press/v130/takai21a.html.
