The AL$\ell_0$CORE Tensor Decomposition for Sparse Count Data
Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, PMLR 238:4654-4662, 2024.
Abstract
This paper introduces AL$\ell_0$CORE, a new form of probabilistic non-negative tensor decomposition. AL$\ell_0$CORE is a Tucker decomposition that constrains the number of non-zero elements (i.e., the $\ell_0$-norm) of the core tensor to be at most $Q$. While the user dictates the total budget $Q$, the locations and values of the non-zero elements are latent variables allocated across the core tensor during inference. AL$\ell_0$CORE—i.e., allocated $\ell_0$-constrained core—thus enjoys both the computational tractability of canonical polyadic (CP) decomposition and the qualitatively appealing latent structure of Tucker. In a suite of real-data experiments, we demonstrate that AL$\ell_0$CORE typically requires only tiny fractions (e.g., 1%) of the core to achieve the same results as Tucker at a correspondingly small fraction of the cost.
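To make the structure concrete, the sketch below shows how a Tucker reconstruction with a $Q$-sparse core reduces to a sum of $Q$ rank-1 terms, which is why the cost scales with $Q$ rather than with the full core size. This is an illustrative assumption of the representation only (the function name, array shapes, and NumPy implementation are hypothetical); it is not the paper's inference procedure.

```python
import numpy as np

def reconstruct_from_sparse_core(factors, core_indices, core_values):
    """Reconstruct a tensor from mode factor matrices and a Q-sparse Tucker core.

    factors      : list of factor matrices, one per mode, shapes (I_m, K_m)
    core_indices : (Q, M) integer array; row q holds the core location of the
                   q-th non-zero core element
    core_values  : (Q,) array of the corresponding non-zero core values
    """
    shape = tuple(U.shape[0] for U in factors)
    X = np.zeros(shape)
    # Each non-zero core element contributes one rank-1 term, so the cost is
    # O(Q * prod(I_m)) instead of O(prod(K_m) * prod(I_m)) for a dense core.
    for q, idx in enumerate(core_indices):
        rank_one = factors[0][:, idx[0]]
        for m in range(1, len(factors)):
            rank_one = np.multiply.outer(rank_one, factors[m][:, idx[m]])
        X += core_values[q] * rank_one
    return X

# Example: a 3-mode tensor with a budget of Q = 2 non-zero core entries.
U = [np.random.rand(10, 4), np.random.rand(8, 4), np.random.rand(6, 4)]
idx = np.array([[0, 1, 2], [3, 3, 0]])
vals = np.array([2.0, 0.5])
X = reconstruct_from_sparse_core(U, idx, vals)
```

Viewed this way, a $Q$-sparse core behaves like a CP decomposition with $Q$ components whose factor columns may repeat, while retaining Tucker's shared dictionary of factor columns per mode.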