Low coordinate degree algorithms II: Categorical signals and generalized stochastic block models

Dmitriy Kunisky
Proceedings of Thirty Eighth Conference on Learning Theory, PMLR 291:3486-3526, 2025.

Abstract

We study when low coordinate degree functions (LCDF)—linear combinations of functions depending on small subsets of entries of a vector—can test for the presence of categorical structure, including community structure and generalizations thereof, in high-dimensional data. This complements recent results studying the power of LCDF in testing for continuous structure like real-valued signals corrupted by additive noise. We study a general form of stochastic block model (SBM), where a population is assigned random labels and every $p$-tuple generates an observation according to an arbitrary probability measure associated to the $p$ labels of its members. We show that the performance of LCDF admits a unified analysis for this class of models. As applications, we prove tight lower bounds against LCDF for broad families of previously studied graph and uniform hypergraph SBMs, always matching suitable generalizations of the Kesten-Stigum threshold. We also prove tight lower bounds for group synchronization and abelian group sumset problems under the “truth-or-Haar” noise model, and give an improved analysis of Gaussian multi-frequency group synchronization. In most of these models, for some parameter settings our lower bounds give new evidence for conjectural statistical-to-computational gaps. Finally, interpreting some of our findings, we propose a new analogy between categorical and continuous signals: a general SBM as above behaves qualitatively like a spiked $p_*$-tensor model of a certain order $p_*$ depending on the parameters of the SBM.

Cite this Paper


BibTeX
@InProceedings{pmlr-v291-kunisky25a,
  title     = {Low coordinate degree algorithms II: Categorical signals and generalized stochastic block models},
  author    = {Kunisky, Dmitriy},
  booktitle = {Proceedings of Thirty Eighth Conference on Learning Theory},
  pages     = {3486--3526},
  year      = {2025},
  editor    = {Haghtalab, Nika and Moitra, Ankur},
  volume    = {291},
  series    = {Proceedings of Machine Learning Research},
  month     = {30 Jun--04 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v291/main/assets/kunisky25a/kunisky25a.pdf},
  url       = {https://proceedings.mlr.press/v291/kunisky25a.html},
  abstract  = {We study when low coordinate degree functions (LCDF)—linear combinations of functions depending on small subsets of entries of a vector—can test for the presence of categorical structure, including community structure and generalizations thereof, in high-dimensional data. This complements recent results studying the power of LCDF in testing for continuous structure like real-valued signals corrupted by additive noise. We study a general form of stochastic block model (SBM), where a population is assigned random labels and every $p$-tuple generates an observation according to an arbitrary probability measure associated to the $p$ labels of its members. We show that the performance of LCDF admits a unified analysis for this class of models. As applications, we prove tight lower bounds against LCDF for broad families of previously studied graph and uniform hypergraph SBMs, always matching suitable generalizations of the Kesten-Stigum threshold. We also prove tight lower bounds for group synchronization and abelian group sumset problems under the “truth-or-Haar” noise model, and give an improved analysis of Gaussian multi-frequency group synchronization. In most of these models, for some parameter settings our lower bounds give new evidence for conjectural statistical-to-computational gaps. Finally, interpreting some of our findings, we propose a new analogy between categorical and continuous signals: a general SBM as above behaves qualitatively like a spiked $p_*$-tensor model of a certain order $p_*$ depending on the parameters of the SBM.}
}
Endnote
%0 Conference Paper
%T Low coordinate degree algorithms II: Categorical signals and generalized stochastic block models
%A Dmitriy Kunisky
%B Proceedings of Thirty Eighth Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2025
%E Nika Haghtalab
%E Ankur Moitra
%F pmlr-v291-kunisky25a
%I PMLR
%P 3486--3526
%U https://proceedings.mlr.press/v291/kunisky25a.html
%V 291
%X We study when low coordinate degree functions (LCDF)—linear combinations of functions depending on small subsets of entries of a vector—can test for the presence of categorical structure, including community structure and generalizations thereof, in high-dimensional data. This complements recent results studying the power of LCDF in testing for continuous structure like real-valued signals corrupted by additive noise. We study a general form of stochastic block model (SBM), where a population is assigned random labels and every $p$-tuple generates an observation according to an arbitrary probability measure associated to the $p$ labels of its members. We show that the performance of LCDF admits a unified analysis for this class of models. As applications, we prove tight lower bounds against LCDF for broad families of previously studied graph and uniform hypergraph SBMs, always matching suitable generalizations of the Kesten-Stigum threshold. We also prove tight lower bounds for group synchronization and abelian group sumset problems under the “truth-or-Haar” noise model, and give an improved analysis of Gaussian multi-frequency group synchronization. In most of these models, for some parameter settings our lower bounds give new evidence for conjectural statistical-to-computational gaps. Finally, interpreting some of our findings, we propose a new analogy between categorical and continuous signals: a general SBM as above behaves qualitatively like a spiked $p_*$-tensor model of a certain order $p_*$ depending on the parameters of the SBM.
APA
Kunisky, D. (2025). Low coordinate degree algorithms II: Categorical signals and generalized stochastic block models. Proceedings of Thirty Eighth Conference on Learning Theory, in Proceedings of Machine Learning Research 291:3486-3526. Available from https://proceedings.mlr.press/v291/kunisky25a.html.