Non-Oblivious Performance of Random Projections
Proceedings of the 16th Asian Conference on Machine Learning, PMLR 260:1128-1143, 2025.
Abstract
Random projections are a cornerstone of high-dimensional computations. However, their analysis has proven difficult and often inadequate for capturing the empirically observed accuracy. To bridge this gap, this paper studies random projections from a novel perspective, focusing on data-dependent, that is, \emph{non-oblivious}, performance.
The key contribution is a precise, data-dependent accuracy analysis of Rademacher random projections, achieved through elegant geometric methods of independent interest, namely, \emph{Schur-concavity}. The result formalizes the following property: the less spread-out the data is, the better the accuracy. This leads to notable improvements in accuracy guarantees for data that is sparse or distributed with a small spread.
The key tool is a novel algebraic framework for proving Schur-concavity properties, which offers an alternative to derivative-based criteria commonly used in related studies. We demonstrate its value by providing an alternative proof for the extension of the celebrated Khintchine inequality.
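The norm-preservation behavior of a Rademacher random projection can be illustrated with a minimal sketch (not the paper's construction in detail; dimensions, seed, and the sparse test vector are illustrative assumptions): a projection matrix with i.i.d. \(\pm 1\) entries, scaled by \(1/\sqrt{k}\), approximately preserves the squared norm of a fixed vector.

```python
import numpy as np

rng = np.random.default_rng(0)
d, k = 1000, 200  # ambient and projected dimensions (illustrative choices)

# Rademacher projection: i.i.d. +/-1 entries, scaled by 1/sqrt(k) so that
# E[||Rv||^2] = ||v||^2 for any fixed vector v.
R = rng.choice([-1.0, 1.0], size=(k, d)) / np.sqrt(k)

x_dense = rng.standard_normal(d)
x_sparse = np.zeros(d)
x_sparse[:5] = rng.standard_normal(5)  # a "less spread-out" (sparse) vector

for name, v in [("dense", x_dense), ("sparse", x_sparse)]:
    ratio = np.linalg.norm(R @ v) ** 2 / np.linalg.norm(v) ** 2
    print(f"{name}: ||Rv||^2 / ||v||^2 = {ratio:.3f}")
```

In both cases the ratio concentrates near 1; the paper's non-oblivious analysis makes the dependence of this concentration on how spread-out the vector is precise.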