Large Average Subtensor Problem: Ground-State, Algorithms, and Algorithmic Barriers

Abhishek Hegade K. R., Eren C. Kizildag
Proceedings of The 37th International Conference on Algorithmic Learning Theory, PMLR 313:1-2, 2026.

Abstract

We introduce the large average subtensor problem: given an order-$p$ tensor over $\mathbb{R}^{N\times \cdots \times N}$ with i.i.d. standard normal entries and a $k\in\mathbb{N}$, algorithmically find a $k\times \cdots \times k$ subtensor with a large average entry. This generalizes the large average submatrix problem, a key model closely related to biclustering and high-dimensional data analysis, to tensors. For the submatrix case, \citet*{bhamidi2017energy} explicitly highlight the regime $k=\Theta(N)$ as an intriguing open question. Addressing the regime $k=\Theta(N)$ for tensors, we establish that the largest average entry concentrates around an explicit value $E_{\mathrm{max}}$, provided that the tensor order $p$ is sufficiently large. Furthermore, we prove that for any $\gamma>0$ and large $p$, this model exhibits the multi Overlap Gap Property ($m$-OGP) above the threshold $\gamma E_{\mathrm{max}}$. The $m$-OGP is a rigorous barrier for a broad class of algorithms exhibiting input stability. These results hold for both $k=\Theta(N)$ and $k=o(N)$. For smaller values of $k$, we also propose a polynomial-time algorithm that finds a subtensor with an average entry of order $\Theta_p(\frac{1}{\sqrt{p}})E_{\mathrm{max}}$. In particular, the $m$-OGP is asymptotically sharp: the onset of the $m$-OGP and the algorithmic threshold match as $p$ grows. Our results show that while the case $k=\Theta(N)$ remains open for submatrices, it can be rigorously analyzed for tensors in the large $p$ regime. This is achieved by interpreting the model as a Boolean spin glass and drawing on insights from recent advances in Ising $p$-spin glasses.
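As a concrete illustration of the problem the abstract defines (not a method from the paper), the following sketch brute-forces the largest average entry over all $k\times k\times k$ subtensors of a small order-3 tensor with i.i.d. standard normal entries. The sizes `N`, `k`, `p` are illustrative assumptions; the exhaustive search enumerates all $\binom{N}{k}^p$ index choices and is only feasible for tiny instances.

```python
import itertools
import random

random.seed(0)
N, k, p = 5, 2, 3  # illustrative sizes (assumption; far smaller than the asymptotic regime)

# Order-3 tensor with i.i.d. standard normal entries.
T = [[[random.gauss(0.0, 1.0) for _ in range(N)]
      for _ in range(N)]
     for _ in range(N)]

def subtensor_average(rows, cols, tubes):
    """Average entry of the subtensor indexed by rows x cols x tubes."""
    total = sum(T[i][j][l] for i in rows for j in cols for l in tubes)
    return total / (len(rows) * len(cols) * len(tubes))

# Exhaustive search over all C(N, k)^p index choices (exponential in general;
# fine for N = 5, k = 2, where there are only 10^3 = 1000 candidates).
best = max(
    subtensor_average(r, c, t)
    for r in itertools.combinations(range(N), k)
    for c in itertools.combinations(range(N), k)
    for t in itertools.combinations(range(N), k)
)
print(round(best, 4))
```

The paper's interest is in the regime where this search space is exponentially large (e.g. $k=\Theta(N)$), where exhaustive enumeration is hopeless and the $m$-OGP rules out a broad class of stable polynomial-time algorithms above $\gamma E_{\mathrm{max}}$.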

Cite this Paper


BibTeX
@InProceedings{pmlr-v313-r-26a,
  title     = {Large Average Subtensor Problem: Ground-State, Algorithms, and Algorithmic Barriers},
  author    = {R., Abhishek Hegade K. and Kizildag, Eren C.},
  booktitle = {Proceedings of The 37th International Conference on Algorithmic Learning Theory},
  pages     = {1--2},
  year      = {2026},
  editor    = {Telgarsky, Matus and Ullman, Jonathan},
  volume    = {313},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--26 Feb},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v313/main/assets/r-26a/r-26a.pdf},
  url       = {https://proceedings.mlr.press/v313/r-26a.html},
  abstract  = {We introduce the large average subtensor problem: given an order-$p$ tensor over $\mathbb{R}^{N\times \cdots \times N}$ with i.i.d. standard normal entries and a $k\in\mathbb{N}$, algorithmically find a $k\times \cdots \times k$ subtensor with a large average entry. This generalizes the large average submatrix problem, a key model closely related to biclustering and high-dimensional data analysis, to tensors. For the submatrix case, \citet*{bhamidi2017energy} explicitly highlight the regime $k=\Theta(N)$ as an intriguing open question. Addressing the regime $k=\Theta(N)$ for tensors, we establish that the largest average entry concentrates around an explicit value $E_{\mathrm{max}}$, provided that the tensor order $p$ is sufficiently large. Furthermore, we prove that for any $\gamma>0$ and large $p$, this model exhibits multi Overlap Gap Property ($m$-OGP) above the threshold $\gamma E_{\mathrm{max}}$. The $m$-OGP is a rigorous barrier for a broad class of algorithms exhibiting input stability. These results hold for both $k=\Theta(N)$ and $k=o(N)$. For smaller values of $k$, we also propose a polynomial-time algorithm that finds a subtensor with an average entry of order $\Theta_p(\frac{1}{\sqrt{p}})E_{\mathrm{max}}$. In particular, the $m$-OGP is asymptotically sharp: onset of the $m$-OGP and the algorithmic threshold match as $p$ grows. Our results show that while the case $k=\Theta(N)$ remains open for submatrices, it can be rigorously analyzed for tensors in the large $p$ regime. This is achieved by interpreting the model as a Boolean spin glass and drawing on insights from recent advances in the Ising $p$-spin glasses.}
}
Endnote
%0 Conference Paper
%T Large Average Subtensor Problem: Ground-State, Algorithms, and Algorithmic Barriers
%A Abhishek Hegade K. R.
%A Eren C. Kizildag
%B Proceedings of The 37th International Conference on Algorithmic Learning Theory
%C Proceedings of Machine Learning Research
%D 2026
%E Matus Telgarsky
%E Jonathan Ullman
%F pmlr-v313-r-26a
%I PMLR
%P 1--2
%U https://proceedings.mlr.press/v313/r-26a.html
%V 313
%X We introduce the large average subtensor problem: given an order-$p$ tensor over $\mathbb{R}^{N\times \cdots \times N}$ with i.i.d. standard normal entries and a $k\in\mathbb{N}$, algorithmically find a $k\times \cdots \times k$ subtensor with a large average entry. This generalizes the large average submatrix problem, a key model closely related to biclustering and high-dimensional data analysis, to tensors. For the submatrix case, \citet*{bhamidi2017energy} explicitly highlight the regime $k=\Theta(N)$ as an intriguing open question. Addressing the regime $k=\Theta(N)$ for tensors, we establish that the largest average entry concentrates around an explicit value $E_{\mathrm{max}}$, provided that the tensor order $p$ is sufficiently large. Furthermore, we prove that for any $\gamma>0$ and large $p$, this model exhibits multi Overlap Gap Property ($m$-OGP) above the threshold $\gamma E_{\mathrm{max}}$. The $m$-OGP is a rigorous barrier for a broad class of algorithms exhibiting input stability. These results hold for both $k=\Theta(N)$ and $k=o(N)$. For smaller values of $k$, we also propose a polynomial-time algorithm that finds a subtensor with an average entry of order $\Theta_p(\frac{1}{\sqrt{p}})E_{\mathrm{max}}$. In particular, the $m$-OGP is asymptotically sharp: onset of the $m$-OGP and the algorithmic threshold match as $p$ grows. Our results show that while the case $k=\Theta(N)$ remains open for submatrices, it can be rigorously analyzed for tensors in the large $p$ regime. This is achieved by interpreting the model as a Boolean spin glass and drawing on insights from recent advances in the Ising $p$-spin glasses.
APA
R., A.H.K. & Kizildag, E.C. (2026). Large Average Subtensor Problem: Ground-State, Algorithms, and Algorithmic Barriers. Proceedings of The 37th International Conference on Algorithmic Learning Theory, in Proceedings of Machine Learning Research 313:1-2. Available from https://proceedings.mlr.press/v313/r-26a.html.