Stochastic Variance-Reduced Algorithms for PCA with Arbitrary Mini-Batch Sizes

Cheolmin Kim, Diego Klabjan
Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, PMLR 108:4302-4312, 2020.

Abstract

We present two stochastic variance-reduced PCA algorithms and their convergence analyses. By deriving explicit forms of the step size, epoch length and batch size that ensure the optimal runtime, we show that the proposed algorithms attain the optimal runtime with any batch size. We also establish global convergence of the algorithms based on a novel approach that studies the optimality gap as a ratio of two expectation terms. The framework in our analysis is general and can be used to analyze other stochastic variance-reduced PCA algorithms and improve their analyses. Moreover, we introduce practical implementations of the algorithms which do not require hyper-parameters. The experimental results show that the proposed methods outperform other stochastic variance-reduced PCA algorithms regardless of the batch size.
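To make the setting concrete, the sketch below shows a generic mini-batch variance-reduced PCA update of the VR-PCA/SVRG type that the abstract refers to. The function name minibatch_vr_pca and the specific hyper-parameter values (step size eta, epoch length m, batch size b) are illustrative assumptions for this sketch; they are not the explicit step size, epoch length and batch size derived in the paper.

import numpy as np

def minibatch_vr_pca(X, eta=0.05, epochs=20, m=None, b=16, seed=0):
    """Mini-batch variance-reduced PCA sketch (VR-PCA/SVRG-style).

    X: (n, d) data matrix with rows as samples.
    eta, epochs, m, b are illustrative hyper-parameters, not the
    values derived in the paper.
    Returns a unit vector approximating the leading eigenvector of X^T X / n.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    m = m or max(1, n // b)              # inner iterations per epoch
    w_tilde = rng.standard_normal(d)
    w_tilde /= np.linalg.norm(w_tilde)

    for _ in range(epochs):
        # Full-data anchor term, computed once per epoch (variance reduction).
        u_tilde = X.T @ (X @ w_tilde) / n
        w = w_tilde.copy()
        for _ in range(m):
            idx = rng.choice(n, size=b, replace=False)
            Xb = X[idx]
            # Stochastic update with a control variate built from w_tilde.
            g = Xb.T @ (Xb @ w - Xb @ w_tilde) / b + u_tilde
            w = w + eta * g
            w /= np.linalg.norm(w)       # project back onto the unit sphere
        w_tilde = w
    return w_tilde

# Example usage on synthetic data (illustrative only).
# X = np.random.default_rng(1).standard_normal((1000, 50))
# w = minibatch_vr_pca(X, b=32)

The per-epoch anchor term u_tilde is what distinguishes variance-reduced methods from plain stochastic power iteration: the mini-batch correction has zero mean and shrinking variance as w approaches w_tilde, which is what makes constant step sizes and arbitrary batch sizes viable.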

Cite this Paper


BibTeX
@InProceedings{pmlr-v108-kim20e, title = {Stochastic Variance-Reduced Algorithms for PCA with Arbitrary Mini-Batch Sizes}, author = {Kim, Cheolmin and Klabjan, Diego}, booktitle = {Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics}, pages = {4302--4312}, year = {2020}, editor = {Chiappa, Silvia and Calandra, Roberto}, volume = {108}, series = {Proceedings of Machine Learning Research}, month = {26--28 Aug}, publisher = {PMLR}, pdf = {http://proceedings.mlr.press/v108/kim20e/kim20e.pdf}, url = {https://proceedings.mlr.press/v108/kim20e.html}, abstract = {We present two stochastic variance-reduced PCA algorithms and their convergence analyses. By deriving explicit forms of the step size, epoch length and batch size that ensure the optimal runtime, we show that the proposed algorithms attain the optimal runtime with any batch size. We also establish global convergence of the algorithms based on a novel approach that studies the optimality gap as a ratio of two expectation terms. The framework in our analysis is general and can be used to analyze other stochastic variance-reduced PCA algorithms and improve their analyses. Moreover, we introduce practical implementations of the algorithms which do not require hyper-parameters. The experimental results show that the proposed methods outperform other stochastic variance-reduced PCA algorithms regardless of the batch size.} }
Endnote
%0 Conference Paper %T Stochastic Variance-Reduced Algorithms for PCA with Arbitrary Mini-Batch Sizes %A Cheolmin Kim %A Diego Klabjan %B Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics %C Proceedings of Machine Learning Research %D 2020 %E Silvia Chiappa %E Roberto Calandra %F pmlr-v108-kim20e %I PMLR %P 4302--4312 %U https://proceedings.mlr.press/v108/kim20e.html %V 108 %X We present two stochastic variance-reduced PCA algorithms and their convergence analyses. By deriving explicit forms of the step size, epoch length and batch size that ensure the optimal runtime, we show that the proposed algorithms attain the optimal runtime with any batch size. We also establish global convergence of the algorithms based on a novel approach that studies the optimality gap as a ratio of two expectation terms. The framework in our analysis is general and can be used to analyze other stochastic variance-reduced PCA algorithms and improve their analyses. Moreover, we introduce practical implementations of the algorithms which do not require hyper-parameters. The experimental results show that the proposed methods outperform other stochastic variance-reduced PCA algorithms regardless of the batch size.
APA
Kim, C. & Klabjan, D. (2020). Stochastic Variance-Reduced Algorithms for PCA with Arbitrary Mini-Batch Sizes. Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 108:4302-4312. Available from https://proceedings.mlr.press/v108/kim20e.html.

Related Material