Information-Theoretic Guarantees for Recovering Low-Rank Tensors from Symmetric Rank-One Measurements

Eren C. Kızıldağ
Proceedings of The 36th International Conference on Algorithmic Learning Theory, PMLR 272:649-652, 2025.

Abstract

We investigate the sample complexity of recovering tensors with low symmetric rank from symmetric rank-one measurements, a setting particularly motivated by the study of higher-order interactions in statistics and the analysis of two-layer polynomial neural networks. Using a covering number argument, we analyze the performance of the symmetric rank minimization program and establish near-optimal sample complexity bounds when the underlying distribution is log-concave. Our measurement model involves random symmetric rank-one tensors, leading to involved probability calculations. To address these challenges, we employ the Carbery-Wright inequality, a powerful tool for studying anti-concentration properties of random polynomials, and leverage orthogonal polynomial expansions. Additionally, we provide a sample complexity lower bound via Fano's inequality, and discuss broader implications of our results for two-layer polynomial networks.

Cite this Paper


BibTeX
@InProceedings{pmlr-v272-kizildag25a,
  title     = {{Information-Theoretic Guarantees for Recovering Low-Rank Tensors from Symmetric Rank-One Measurements}},
  author    = {K{\i}z{\i}lda\u{g}, Eren C.},
  booktitle = {Proceedings of The 36th International Conference on Algorithmic Learning Theory},
  pages     = {649--652},
  year      = {2025},
  editor    = {Kamath, Gautam and Loh, Po-Ling},
  volume    = {272},
  series    = {Proceedings of Machine Learning Research},
  month     = {24--27 Feb},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v272/main/assets/kizildag25a/kizildag25a.pdf},
  url       = {https://proceedings.mlr.press/v272/kizildag25a.html},
  abstract  = {We investigate the sample complexity of recovering tensors with low symmetric rank from symmetric rank-one measurements, a setting particularly motivated by the study of higher-order interactions in statistics and the analysis of two-layer polynomial neural networks. Using a covering number argument, we analyze the performance of the symmetric rank minimization program and establish near-optimal sample complexity bounds when the underlying distribution is log-concave. Our measurement model involves random symmetric rank-one tensors, leading to involved probability calculations. To address these challenges, we employ the Carbery-Wright inequality, a powerful tool for studying anti-concentration properties of random polynomials, and leverage orthogonal polynomial expansions. Additionally, we provide a sample complexity lower bound via Fano's inequality, and discuss broader implications of our results for two-layer polynomial networks.}
}
Endnote
%0 Conference Paper
%T Information-Theoretic Guarantees for Recovering Low-Rank Tensors from Symmetric Rank-One Measurements
%A Eren C. Kızıldağ
%B Proceedings of The 36th International Conference on Algorithmic Learning Theory
%C Proceedings of Machine Learning Research
%D 2025
%E Gautam Kamath
%E Po-Ling Loh
%F pmlr-v272-kizildag25a
%I PMLR
%P 649--652
%U https://proceedings.mlr.press/v272/kizildag25a.html
%V 272
%X We investigate the sample complexity of recovering tensors with low symmetric rank from symmetric rank-one measurements, a setting particularly motivated by the study of higher-order interactions in statistics and the analysis of two-layer polynomial neural networks. Using a covering number argument, we analyze the performance of the symmetric rank minimization program and establish near-optimal sample complexity bounds when the underlying distribution is log-concave. Our measurement model involves random symmetric rank-one tensors, leading to involved probability calculations. To address these challenges, we employ the Carbery-Wright inequality, a powerful tool for studying anti-concentration properties of random polynomials, and leverage orthogonal polynomial expansions. Additionally, we provide a sample complexity lower bound via Fano's inequality, and discuss broader implications of our results for two-layer polynomial networks.
APA
Kızıldağ, E.C. (2025). Information-Theoretic Guarantees for Recovering Low-Rank Tensors from Symmetric Rank-One Measurements. Proceedings of The 36th International Conference on Algorithmic Learning Theory, in Proceedings of Machine Learning Research 272:649-652. Available from https://proceedings.mlr.press/v272/kizildag25a.html.