Nonlinear Acceleration of Primal-Dual Algorithms
Proceedings of Machine Learning Research, PMLR 89:739-747, 2019.
Abstract
We describe a convergence acceleration scheme for multi-step optimization algorithms. The extrapolated solution is written as a nonlinear average of the iterates produced by the original optimization algorithm. Our scheme does not need the underlying fixed-point operator to be symmetric, hence handles e.g. algorithms with momentum terms such as Nesterov's accelerated method, or primal-dual methods such as Chambolle-Pock. The weights are computed via a simple linear system and we analyze performance in both online and offline modes. We use Crouzeix's conjecture to show that acceleration is controlled by the solution of a Chebyshev problem on the numerical range of a nonsymmetric operator modelling the behavior of iterates near the optimum. Numerical experiments are detailed on image processing and logistic regression problems.
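To illustrate the kind of extrapolation the abstract describes, here is a minimal sketch of regularized nonlinear acceleration: the weights of the nonlinear average are obtained from a small regularized linear system built from successive residuals, with the weights constrained to sum to one. The function name `rna_extrapolate` and the regularization parameter `lam` are illustrative choices, not the paper's own implementation.

```python
import numpy as np

def rna_extrapolate(xs, lam=1e-10):
    """Sketch of nonlinear acceleration via a regularized linear system.

    xs: list of iterates x_0, ..., x_k from the base algorithm.
    Returns an affine combination sum_i c_i x_i, where the weights c
    approximately minimize ||R c|| subject to sum(c) = 1, and R stacks
    the residuals r_i = x_{i+1} - x_i.
    """
    X = np.array(xs)                 # shape (k+1, d)
    R = X[1:] - X[:-1]               # residual matrix, shape (k, d)
    k = R.shape[0]
    # Regularized normal equations; lam guards against ill-conditioning.
    M = R @ R.T + lam * np.eye(k)
    c = np.linalg.solve(M, np.ones(k))
    c /= c.sum()                     # enforce the affine constraint
    return c @ X[:-1]                # nonlinear average of the iterates
```

As a usage example, one can run a few gradient-descent steps on a quadratic and extrapolate; the combined point is typically much closer to the minimizer than the last iterate.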