Information-Theoretic Guarantees for Recovering Low-Rank Tensors from Symmetric Rank-One Measurements
Proceedings of The 36th International Conference on Algorithmic Learning Theory, PMLR 272:649-652, 2025.
Abstract
We investigate the sample complexity of recovering tensors with low symmetric rank from symmetric rank-one measurements, a setting particularly motivated by the study of higher-order interactions in statistics and the analysis of two-layer polynomial neural networks. Using a covering number argument, we analyze the performance of the symmetric rank minimization program and establish near-optimal sample complexity bounds when the underlying distribution is log-concave. Our measurement model involves random symmetric rank-one tensors, leading to involved probability calculations. To address these challenges, we employ the Carbery-Wright inequality, a powerful tool for studying anti-concentration properties of random polynomials, and leverage orthogonal polynomial expansions. Additionally, we provide a sample complexity lower bound via Fano's inequality, and discuss broader implications of our results for two-layer polynomial networks.
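As a notational sketch only (the symbols below are illustrative and not taken from the paper itself), the symmetric rank-one measurement model described in the abstract can be written as follows, where $T$ is the unknown order-$d$ symmetric tensor and $a_i$ are random measurement vectors:

```latex
% Hypothetical notation: T is the unknown symmetric tensor of order d,
% a_i \in \mathbb{R}^n are random measurement vectors, y_i the observed scalars.
y_i \;=\; \langle T,\, a_i^{\otimes d} \rangle
      \;=\; \sum_{j_1, \dots, j_d} T_{j_1 \cdots j_d}\, (a_i)_{j_1} \cdots (a_i)_{j_d},
\qquad i = 1, \dots, m.
```

Under this model, the symmetric rank minimization program mentioned above seeks the tensor of smallest symmetric rank consistent with the observations $y_1, \dots, y_m$; each $y_i$ is a degree-$d$ polynomial in $a_i$, which is why anti-concentration tools for random polynomials enter the analysis.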