Convex Geometry of ReLU-layers, Injectivity on the Ball and Local Reconstruction

Daniel Haider, Martin Ehler, Peter Balazs
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:12339-12350, 2023.

Abstract

The paper uses a frame-theoretic setting to study the injectivity of a ReLU-layer on the closed ball of $\mathbb{R}^n$ and its non-negative part. In particular, the interplay between the radius of the ball and the bias vector is emphasized. Together with a perspective from convex geometry, this leads to a computationally feasible method of verifying the injectivity of a ReLU-layer under reasonable restrictions in terms of an upper bound of the bias vector. Explicit reconstruction formulas are provided, inspired by the duality concept from frame theory. All this gives rise to the possibility of quantifying the invertibility of a ReLU-layer and a concrete reconstruction algorithm for any input vector on the ball.
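The abstract only summarizes the approach. As a rough illustration of the frame-duality idea it alludes to, the following sketch (plain NumPy; the function names relu_layer and local_reconstruction are hypothetical and this is not the authors' code) applies a ReLU-layer y = ReLU(Wx - b) and then attempts a local reconstruction: it collects the rows of W that are active at x and, if they still span R^n, recovers x from the active coordinates of y via a dual frame of that active sub-frame, here the Moore-Penrose pseudoinverse. This conveys the flavor of the reconstruction, not the paper's exact algorithm.

# Illustrative sketch only, not the method from the paper.
import numpy as np

def relu_layer(W, b, x):
    """Forward pass of a ReLU-layer with weight matrix W (m x n) and bias b (m,)."""
    return np.maximum(W @ x - b, 0.0)

def local_reconstruction(W, b, y, tol=1e-12):
    """Try to recover x from y = ReLU(Wx - b) using only the active coordinates."""
    active = y > tol                      # indices i with w_i^T x > b_i
    W_act = W[active]                     # active sub-frame (rows of W)
    if np.linalg.matrix_rank(W_act) < W.shape[1]:
        raise ValueError("active rows do not span R^n; x is not determined by y")
    # Canonical dual frame of the active rows, realized via the pseudoinverse:
    # on active indices y_i + b_i = w_i^T x, so pinv(W_act) @ (y + b) returns x.
    return np.linalg.pinv(W_act) @ (y[active] + b[active])

# Small example: a redundant layer (m = 8 > n = 3) on a point inside the unit ball.
rng = np.random.default_rng(0)
W = rng.standard_normal((8, 3))
b = np.full(8, -0.1)                      # bias below zero keeps many units active
x = rng.standard_normal(3)
x /= 2 * np.linalg.norm(x)                # place x strictly inside the unit ball
x_hat = local_reconstruction(W, b, relu_layer(W, b, x))
print(np.allclose(x, x_hat))              # True whenever enough units stay active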

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-haider23a,
  title     = {Convex Geometry of {R}e{LU}-layers, Injectivity on the Ball and Local Reconstruction},
  author    = {Haider, Daniel and Ehler, Martin and Balazs, Peter},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {12339--12350},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/haider23a/haider23a.pdf},
  url       = {https://proceedings.mlr.press/v202/haider23a.html},
  abstract  = {The paper uses a frame-theoretic setting to study the injectivity of a ReLU-layer on the closed ball of $\mathbb{R}^n$ and its non-negative part. In particular, the interplay between the radius of the ball and the bias vector is emphasized. Together with a perspective from convex geometry, this leads to a computationally feasible method of verifying the injectivity of a ReLU-layer under reasonable restrictions in terms of an upper bound of the bias vector. Explicit reconstruction formulas are provided, inspired by the duality concept from frame theory. All this gives rise to the possibility of quantifying the invertibility of a ReLU-layer and a concrete reconstruction algorithm for any input vector on the ball.}
}
Endnote
%0 Conference Paper
%T Convex Geometry of ReLU-layers, Injectivity on the Ball and Local Reconstruction
%A Daniel Haider
%A Martin Ehler
%A Peter Balazs
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-haider23a
%I PMLR
%P 12339--12350
%U https://proceedings.mlr.press/v202/haider23a.html
%V 202
%X The paper uses a frame-theoretic setting to study the injectivity of a ReLU-layer on the closed ball of $\mathbb{R}^n$ and its non-negative part. In particular, the interplay between the radius of the ball and the bias vector is emphasized. Together with a perspective from convex geometry, this leads to a computationally feasible method of verifying the injectivity of a ReLU-layer under reasonable restrictions in terms of an upper bound of the bias vector. Explicit reconstruction formulas are provided, inspired by the duality concept from frame theory. All this gives rise to the possibility of quantifying the invertibility of a ReLU-layer and a concrete reconstruction algorithm for any input vector on the ball.
APA
Haider, D., Ehler, M. & Balazs, P. (2023). Convex Geometry of ReLU-layers, Injectivity on the Ball and Local Reconstruction. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:12339-12350. Available from https://proceedings.mlr.press/v202/haider23a.html.