Open Problem: Fixed-Parameter Tractability of Zonotope Problems

Vincent Froese, Moritz Grillo, Christoph Hertrich, Martin Skutella
Proceedings of Thirty Eighth Conference on Learning Theory, PMLR 291:6210-6214, 2025.

Abstract

Neural networks with ReLU activation play a key role in modern machine learning. Understanding the functions represented by ReLU networks is a major topic in current research. Recent results are achieved via connections to tropical geometry based on a duality between convex piecewise linear functions and polytopes. It turns out that several questions about properties of functions computed by ReLU neural networks can be answered by solving certain problems on special polytopes called zonotopes. For example, computing the Lipschitz constant of a ReLU network with one hidden layer corresponds to norm maximization over a zonotope. Moreover, deciding whether the ReLU network attains a positive output is equivalent to zonotope non-containment. These problems are known to be NP-hard in general but polynomial-time solvable if the input dimension is constant. However, it is open whether they are \emph{fixed-parameter tractable} (FPT) with respect to the input dimension $d$, that is, solvable in $f(d)\cdot n^{O(1)}$ time for some function $f$ solely depending on $d$. Notably, these zonotope problems also arise in other areas such as robotics and control, reachability analysis, pattern recognition, signal processing or political analysis. Thus, settling their parameterized complexity status is of broad interest.
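The norm-maximization problem mentioned in the abstract can be illustrated with a minimal brute-force sketch (the function name and example generators below are ours, for illustration only, not from the paper). Since the Euclidean norm is convex, its maximum over the zonotope Z = { Σᵢ λᵢ gᵢ : λᵢ ∈ [0, 1] } is attained at a vertex, and every vertex arises from some λ ∈ {0, 1}ⁿ, so enumerating all 2ⁿ sign patterns gives an exact answer in time exponential in the number of generators n:

```python
import itertools
import math

def zonotope_norm_max(generators):
    """Exact maximum Euclidean norm over the zonotope spanned by the
    given generators, by enumerating all lambda in {0, 1}^n.  Runs in
    O(2^n * n * d) time; the open question is whether f(d) * poly(n)
    is achievable, i.e. FPT in the dimension d rather than in n."""
    n = len(generators)
    d = len(generators[0])
    best = 0.0
    for lam in itertools.product((0, 1), repeat=n):
        # Point of Z selected by this 0/1 coefficient vector.
        point = [sum(l * g[j] for l, g in zip(lam, generators)) for j in range(d)]
        best = max(best, math.hypot(*point))
    return best

# Two generators in the plane: the zonotope is the unit square,
# whose farthest vertex from the origin is (1, 1).
print(zonotope_norm_max([(1.0, 0.0), (0.0, 1.0)]))  # sqrt(2) ~ 1.4142
```

This exponential-in-n enumeration is only a baseline; the paper's open problem asks whether the dependence can instead be confined to the input dimension d.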

Cite this Paper


BibTeX
@InProceedings{pmlr-v291-froese25b,
  title     = {Open Problem: Fixed-Parameter Tractability of Zonotope Problems},
  author    = {Froese, Vincent and Grillo, Moritz and Hertrich, Christoph and Skutella, Martin},
  booktitle = {Proceedings of Thirty Eighth Conference on Learning Theory},
  pages     = {6210--6214},
  year      = {2025},
  editor    = {Haghtalab, Nika and Moitra, Ankur},
  volume    = {291},
  series    = {Proceedings of Machine Learning Research},
  month     = {30 Jun--04 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v291/main/assets/froese25b/froese25b.pdf},
  url       = {https://proceedings.mlr.press/v291/froese25b.html},
  abstract  = {Neural networks with ReLU activation play a key role in modern machine learning. Understanding the functions represented by ReLU networks is a major topic in current research. Recent results are achieved via connections to tropical geometry based on a duality between convex piecewise linear functions and polytopes. It turns out that several questions about properties of functions computed by ReLU neural networks can be answered by solving certain problems on special polytopes called zonotopes. For example, computing the Lipschitz constant of a ReLU network with one hidden layer corresponds to norm maximization over a zonotope. Moreover, deciding whether the ReLU network attains a positive output is equivalent to zonotope non-containment. These problems are known to be NP-hard in general but polynomial-time solvable if the input dimension is constant. However, it is open whether they are \emph{fixed-parameter tractable} (FPT) with respect to the input dimension $d$, that is, solvable in $f(d)\cdot n^{O(1)}$ time for some function $f$ solely depending on $d$. Notably, these zonotope problems also arise in other areas such as robotics and control, reachability analysis, pattern recognition, signal processing or political analysis. Thus, settling their parameterized complexity status is of broad interest.}
}
Endnote
%0 Conference Paper
%T Open Problem: Fixed-Parameter Tractability of Zonotope Problems
%A Vincent Froese
%A Moritz Grillo
%A Christoph Hertrich
%A Martin Skutella
%B Proceedings of Thirty Eighth Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2025
%E Nika Haghtalab
%E Ankur Moitra
%F pmlr-v291-froese25b
%I PMLR
%P 6210--6214
%U https://proceedings.mlr.press/v291/froese25b.html
%V 291
%X Neural networks with ReLU activation play a key role in modern machine learning. Understanding the functions represented by ReLU networks is a major topic in current research. Recent results are achieved via connections to tropical geometry based on a duality between convex piecewise linear functions and polytopes. It turns out that several questions about properties of functions computed by ReLU neural networks can be answered by solving certain problems on special polytopes called zonotopes. For example, computing the Lipschitz constant of a ReLU network with one hidden layer corresponds to norm maximization over a zonotope. Moreover, deciding whether the ReLU network attains a positive output is equivalent to zonotope non-containment. These problems are known to be NP-hard in general but polynomial-time solvable if the input dimension is constant. However, it is open whether they are \emph{fixed-parameter tractable} (FPT) with respect to the input dimension $d$, that is, solvable in $f(d)\cdot n^{O(1)}$ time for some function $f$ solely depending on $d$. Notably, these zonotope problems also arise in other areas such as robotics and control, reachability analysis, pattern recognition, signal processing or political analysis. Thus, settling their parameterized complexity status is of broad interest.
APA
Froese, V., Grillo, M., Hertrich, C. & Skutella, M. (2025). Open Problem: Fixed-Parameter Tractability of Zonotope Problems. Proceedings of Thirty Eighth Conference on Learning Theory, in Proceedings of Machine Learning Research 291:6210-6214. Available from https://proceedings.mlr.press/v291/froese25b.html.