Primal-Dual Rates and Certificates


Celestine Dünner, Simone Forte, Martin Takáč, Martin Jaggi;
Proceedings of The 33rd International Conference on Machine Learning, PMLR 48:783-792, 2016.


We propose an algorithm-independent framework to equip existing optimization methods with primal-dual certificates. Such certificates, together with the corresponding convergence-rate guarantees, are important for practitioners to diagnose progress, in particular in machine learning applications. We obtain new primal-dual convergence rates, e.g., for the Lasso as well as many L1-, elastic-net-, group-Lasso- and TV-regularized problems. The theory applies to any norm-regularized generalized linear model. Our approach provides efficiently computable duality gaps which are globally defined, without modifying the original problem in the region of interest.
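To illustrate the kind of certificate the abstract refers to, the following is a minimal sketch of a duality-gap computation for the Lasso, min_x 0.5‖Ax − b‖² + λ‖x‖₁: rescaling the residual into the dual-feasible region yields a dual point whose objective lower-bounds the optimum, so the resulting gap upper-bounds the suboptimality at x. This standard construction is only an illustration, not the paper's general framework, which extends such certificates to arbitrary norm regularizers.

```python
import numpy as np

def lasso_duality_gap(A, b, x, lam):
    """Duality gap for the Lasso: min_x 0.5*||Ax - b||^2 + lam*||x||_1.

    Illustrative sketch: a scaled residual gives a dual-feasible point,
    so the returned gap is a nonnegative upper bound on the primal
    suboptimality at x.
    """
    r = b - A @ x                                   # primal residual
    primal = 0.5 * r @ r + lam * np.abs(x).sum()    # primal objective P(x)
    # Rescale the residual into the dual-feasible set ||A^T theta||_inf <= lam.
    scale = min(1.0, lam / max(np.abs(A.T @ r).max(), 1e-15))
    theta = scale * r
    # Dual objective D(theta) = 0.5*||b||^2 - 0.5*||theta - b||^2.
    dual = 0.5 * b @ b - 0.5 * (theta - b) @ (theta - b)
    return primal - dual
```

For example, when λ ≥ ‖Aᵀb‖_∞ the optimum is x = 0, and the gap at that point evaluates to zero, certifying optimality without knowing the solution in advance.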
