Open Problem: Fixed-Parameter Tractability of Zonotope Problems
Proceedings of the Thirty Eighth Conference on Learning Theory, PMLR 291:6210-6214, 2025.
Abstract
Neural networks with ReLU activation play a key role in modern machine learning. Understanding the functions represented by ReLU networks is a major topic in current research. Recent results have been achieved via connections to tropical geometry, based on a duality between convex piecewise linear functions and polytopes. It turns out that several questions about properties of functions computed by ReLU neural networks can be answered by solving certain problems on special polytopes called zonotopes. For example, computing the Lipschitz constant of a ReLU network with one hidden layer corresponds to norm maximization over a zonotope. Moreover, deciding whether the ReLU network attains a positive output is equivalent to zonotope non-containment. These problems are known to be NP-hard in general but polynomial-time solvable if the input dimension is constant. However, it is open whether they are \emph{fixed-parameter tractable} (FPT) with respect to the input dimension $d$, that is, solvable in $f(d)\cdot n^{O(1)}$ time for some function $f$ depending solely on $d$. Notably, these zonotope problems also arise in other areas such as robotics and control, reachability analysis, pattern recognition, signal processing, and political analysis. Thus, settling their parameterized complexity status is of broad interest.
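To make the norm-maximization problem concrete, here is a minimal brute-force sketch (not from the paper): a zonotope centered at the origin is the set $\{\sum_i s_i g_i : s_i \in [-1,1]\}$ for generators $g_1,\dots,g_n$, and since a norm is convex, its maximum over the zonotope is attained at a vertex, i.e. at some sign vector $s \in \{-1,+1\}^n$. Enumerating all sign vectors takes $2^n$ steps, which illustrates why the problem is easy for few generators but hard in general; the function name and example generators below are illustrative choices, not anything defined in the abstract.

```python
import itertools
import math

def zonotope_norm_max(generators):
    """Brute-force maximization of the Euclidean norm over the zonotope
    Z = { sum_i s_i * g_i : s_i in [-1, 1] }, centered at the origin.
    The maximum of a convex function over a polytope is attained at a
    vertex, so it suffices to check all sign vectors in {-1, +1}^n.
    Runs in O(2^n * d) time for n generators in dimension d.
    """
    d = len(generators[0])
    best = 0.0
    for signs in itertools.product((-1.0, 1.0), repeat=len(generators)):
        point = [sum(s * g[j] for s, g in zip(signs, generators))
                 for j in range(d)]
        best = max(best, math.sqrt(sum(x * x for x in point)))
    return best

# Example: two generators in the plane; the optimum picks both with sign +1
# (or both -1), giving the point (2, 1) with norm sqrt(5).
gens = [(1.0, 0.0), (1.0, 1.0)]
print(zonotope_norm_max(gens))  # sqrt(5) ≈ 2.23606...
```

An FPT algorithm in the sense of the abstract would instead need running time $f(d)\cdot n^{O(1)}$, i.e. the exponential cost may depend on the dimension $d$ but not on the number of generators $n$.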