Stochastic Variance-Reduced Algorithms for PCA with Arbitrary Mini-Batch Sizes
Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, PMLR 108:4302-4312, 2020.
Abstract
We present two stochastic variance-reduced PCA algorithms and their convergence analyses. By deriving explicit forms of the step size, epoch length, and batch size needed to ensure the optimal runtime, we show that the proposed algorithms attain the optimal runtime with any batch size. We also establish global convergence of the algorithms based on a novel approach, which studies the optimality gap as a ratio of two expectation terms. The framework in our analysis is general and can be used to analyze other stochastic variance-reduced PCA algorithms and to improve their analyses. Moreover, we introduce practical implementations of the algorithms that do not require hyper-parameters. Experimental results show that the proposed methods outperform other stochastic variance-reduced PCA algorithms regardless of the batch size.
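
To make the setting concrete, below is a minimal sketch of a stochastic variance-reduced PCA iteration with an arbitrary mini-batch size, in the spirit of VR-PCA-style methods for the leading eigenvector of the sample covariance (1/n) X^T X. This is an illustrative assumption, not the paper's algorithm: the function name vr_pca and the hyper-parameter values (eta, epochs, batch_size) are placeholders, whereas the paper derives explicit settings for them.

    import numpy as np

    def vr_pca(X, eta=0.05, epochs=10, inner_iters=None, batch_size=1, seed=0):
        """Sketch of variance-reduced stochastic power iteration for the
        leading eigenvector of (1/n) X^T X. Hyper-parameters are
        illustrative, not the paper's derived values."""
        rng = np.random.default_rng(seed)
        n, d = X.shape
        m = inner_iters if inner_iters is not None else n // batch_size
        w = rng.standard_normal(d)
        w /= np.linalg.norm(w)
        for _ in range(epochs):
            w_snap = w.copy()                  # snapshot iterate for the epoch
            u = X.T @ (X @ w_snap) / n         # full-data term at the snapshot
            for _ in range(m):
                idx = rng.integers(0, n, size=batch_size)
                B = X[idx]
                # variance-reduced stochastic update:
                # (1/b) * sum_{i in batch} x_i x_i^T (w - w_snap) + (1/n) X^T X w_snap
                g = B.T @ (B @ (w - w_snap)) / batch_size + u
                w = w + eta * g
                w /= np.linalg.norm(w)         # project back onto the unit sphere
        return w

The snapshot term u acts as a control variate: as w approaches w_snap, the variance of the stochastic update vanishes, which is what allows a constant step size and, in the paper's analysis, an optimal runtime for any batch size. The output can be checked against a dense eigensolver, e.g. np.linalg.eigh(X.T @ X / n).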