Accelerated Proximal Stochastic Dual Coordinate Ascent for Regularized Loss Minimization

Shai Shalev-Shwartz, Tong Zhang;
Proceedings of the 31st International Conference on Machine Learning, PMLR 32(1):64-72, 2014.

Abstract

We introduce a proximal version of the stochastic dual coordinate ascent method and show how to accelerate it using an inner-outer iteration procedure. We analyze the runtime of the framework and obtain rates that improve upon state-of-the-art results for several key machine learning optimization problems, including SVM, logistic regression, ridge regression, Lasso, and multiclass SVM. Experiments validate our theoretical findings.
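To make the setting concrete, here is a minimal Python sketch of plain stochastic dual coordinate ascent (SDCA), the inner solver that the paper's accelerated proximal framework builds on, specialized to ridge regression where the coordinate-wise dual maximizer has a closed form. The function name `sdca_ridge`, the hyperparameters, and the synthetic data are illustrative assumptions, not from the paper; the paper's method additionally handles nonsmooth regularizers through a proximal operator and wraps an inner solver of this kind in an accelerated outer iteration.

```python
import numpy as np

def sdca_ridge(X, y, lam, epochs=20, seed=0):
    """Plain SDCA sketch for ridge regression (illustrative only; not the
    paper's accelerated proximal variant):

        min_w (1/n) * sum_i 0.5 * (x_i^T w - y_i)^2 + (lam/2) * ||w||^2

    Maintains one dual variable per example and the primal iterate
    w = (1 / (lam * n)) * X^T alpha.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    alpha = np.zeros(n)                      # dual variables
    w = np.zeros(d)                          # primal iterate, consistent with alpha
    sq_norms = np.einsum("ij,ij->i", X, X)   # precomputed ||x_i||^2
    for _ in range(epochs):
        for i in rng.permutation(n):
            # Closed-form maximizer of the dual objective along coordinate i
            # for the squared loss.
            delta = (y[i] - X[i] @ w - alpha[i]) / (1.0 + sq_norms[i] / (lam * n))
            alpha[i] += delta
            w += (delta / (lam * n)) * X[i]  # keep w = X^T alpha / (lam * n)
    return w

# Tiny usage example on synthetic data.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 5))
w_true = rng.standard_normal(5)
y = X @ w_true + 0.1 * rng.standard_normal(200)
w_hat = sdca_ridge(X, y, lam=0.1)
```

For a nonsmooth regularizer such as the L1 penalty in Lasso, the proximal extension replaces the simple primal update with a prox step, and the acceleration described in the abstract comes from calling the inner solver on a sequence of shifted subproblems in an outer loop.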
