Parameter-free Mirror Descent

Andrew Jacobsen, Ashok Cutkosky
Proceedings of Thirty Fifth Conference on Learning Theory, PMLR 178:4160-4211, 2022.

Abstract

We develop a modified online mirror descent framework that is suitable for building adaptive and parameter-free algorithms in unbounded domains. We leverage this technique to develop the first unconstrained online linear optimization algorithm achieving an optimal dynamic regret bound, and we further demonstrate that natural strategies based on Follow-the-Regularized-Leader are unable to achieve similar results. We also apply our mirror descent framework to build new parameter-free implicit updates, as well as a simplified and improved unconstrained scale-free algorithm.
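For readers unfamiliar with the setting: in unconstrained online linear optimization the learner plays a point w_t in all of R^d, then observes a loss vector g_t and suffers the linear loss <g_t, w_t>; dynamic regret compares the learner against a time-varying comparator sequence rather than a single fixed point. As a point of reference, below is a minimal sketch of the classical online mirror descent baseline that the paper's framework modifies, using a Euclidean mirror map (so the update reduces to a gradient step) and an AdaGrad-style adaptive step size. All names here are illustrative, and this is not the authors' algorithm; the scale eta0 is precisely the kind of tuning parameter that a parameter-free method aims to eliminate.

```python
import numpy as np

def omd_euclidean(loss_vectors, dim, eta0=1.0):
    """Classical online mirror descent with a Euclidean mirror map
    (i.e., online gradient descent) and an AdaGrad-style step size.

    A baseline sketch only -- NOT the paper's parameter-free method.
    eta0 is the tuning parameter that parameter-free algorithms remove.
    """
    w = np.zeros(dim)          # iterate w_t; the domain is all of R^dim (unbounded)
    sq_sum = 0.0               # running sum of squared gradient norms
    plays = []
    for g in loss_vectors:     # g_t is revealed only after playing w_t
        plays.append(w.copy())
        sq_sum += float(g @ g)
        eta = eta0 / np.sqrt(sq_sum + 1e-12)  # adaptive step size
        w = w - eta * g        # Euclidean mirror map => plain gradient step
    return plays

# Usage: linear losses g_t; total loss of the iterates played online.
rng = np.random.default_rng(0)
gs = [rng.normal(size=3) for _ in range(100)]
ws = omd_euclidean(gs, dim=3)
alg_loss = sum(g @ w for g, w in zip(gs, ws))
```

With the Euclidean mirror map this update needs eta0 chosen relative to the (unknown) comparator norm to perform well on an unbounded domain; the paper's modified mirror descent framework is designed to avoid exactly that dependence.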

Cite this Paper

BibTeX
@InProceedings{pmlr-v178-jacobsen22a,
  title     = {Parameter-free Mirror Descent},
  author    = {Jacobsen, Andrew and Cutkosky, Ashok},
  booktitle = {Proceedings of Thirty Fifth Conference on Learning Theory},
  pages     = {4160--4211},
  year      = {2022},
  editor    = {Loh, Po-Ling and Raginsky, Maxim},
  volume    = {178},
  series    = {Proceedings of Machine Learning Research},
  month     = {02--05 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v178/jacobsen22a/jacobsen22a.pdf},
  url       = {https://proceedings.mlr.press/v178/jacobsen22a.html},
  abstract  = {We develop a modified online mirror descent framework that is suitable for building adaptive and parameter-free algorithms in unbounded domains. We leverage this technique to develop the first unconstrained online linear optimization algorithm achieving an optimal dynamic regret bound, and we further demonstrate that natural strategies based on Follow-the-Regularized-Leader are unable to achieve similar results. We also apply our mirror descent framework to build new parameter-free implicit updates, as well as a simplified and improved unconstrained scale-free algorithm.}
}
EndNote
%0 Conference Paper
%T Parameter-free Mirror Descent
%A Andrew Jacobsen
%A Ashok Cutkosky
%B Proceedings of Thirty Fifth Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2022
%E Po-Ling Loh
%E Maxim Raginsky
%F pmlr-v178-jacobsen22a
%I PMLR
%P 4160--4211
%U https://proceedings.mlr.press/v178/jacobsen22a.html
%V 178
%X We develop a modified online mirror descent framework that is suitable for building adaptive and parameter-free algorithms in unbounded domains. We leverage this technique to develop the first unconstrained online linear optimization algorithm achieving an optimal dynamic regret bound, and we further demonstrate that natural strategies based on Follow-the-Regularized-Leader are unable to achieve similar results. We also apply our mirror descent framework to build new parameter-free implicit updates, as well as a simplified and improved unconstrained scale-free algorithm.
APA
Jacobsen, A. & Cutkosky, A. (2022). Parameter-free Mirror Descent. Proceedings of Thirty Fifth Conference on Learning Theory, in Proceedings of Machine Learning Research 178:4160-4211. Available from https://proceedings.mlr.press/v178/jacobsen22a.html.