A Tale of Two Metrics: Simultaneous Bounds on Competitiveness and Regret

Lachlan Andrew, Siddharth Barman, Katrina Ligett, Minghong Lin, Adam Meyerson, Alan Roytman, Adam Wierman
Proceedings of the 26th Annual Conference on Learning Theory, PMLR 30:741-763, 2013.

Abstract

We consider algorithms for “smoothed online convex optimization” problems, a variant of the class of online convex optimization problems that is strongly related to metrical task systems. Prior literature on these problems has focused on two performance metrics: regret and the competitive ratio. There exist known algorithms with sublinear regret and known algorithms with constant competitive ratios; however, no known algorithm achieves both simultaneously. We show that this is due to a fundamental incompatibility between these two metrics: no algorithm (deterministic or randomized) can achieve sublinear regret and a constant competitive ratio, even in the case when the objective functions are linear. However, we also exhibit an algorithm that, for the important special case of one-dimensional decision spaces, provides sublinear regret while maintaining a competitive ratio that grows arbitrarily slowly.
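For readers unfamiliar with the two metrics, the standard definitions from the online learning literature can be sketched as follows (the notation here is illustrative, not taken from the paper): $c_t$ is the cost function revealed at time $t$, $x_t$ is the algorithm's decision, and the total cost includes a switching penalty, as is standard in smoothed online convex optimization:

```latex
% Total cost of an online algorithm ALG over horizon T, including switching costs:
\[
  \mathrm{cost}(ALG) \;=\; \sum_{t=1}^{T} c_t(x_t) \;+\; \sum_{t=1}^{T} \lVert x_t - x_{t-1} \rVert .
\]
% Regret compares against the best *static* point in hindsight:
\[
  \mathrm{Regret}(T) \;=\; \mathrm{cost}(ALG) \;-\; \min_{x} \sum_{t=1}^{T} c_t(x).
\]
% The competitive ratio compares against the optimal *dynamic* offline solution OPT:
\[
  \mathrm{CR} \;=\; \sup_{c_1,\dots,c_T} \frac{\mathrm{cost}(ALG)}{\mathrm{cost}(OPT)}.
\]
```

"Sublinear regret" means $\mathrm{Regret}(T) = o(T)$, while a "constant competitive ratio" means $\mathrm{CR} = O(1)$; the incompatibility result concerns achieving both against these two different comparators (static vs. dynamic) at once.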

Cite this Paper


BibTeX
@InProceedings{pmlr-v30-Andrew13,
  title     = {A Tale of Two Metrics: Simultaneous Bounds on Competitiveness and Regret},
  author    = {Andrew, Lachlan and Barman, Siddharth and Ligett, Katrina and Lin, Minghong and Meyerson, Adam and Roytman, Alan and Wierman, Adam},
  booktitle = {Proceedings of the 26th Annual Conference on Learning Theory},
  pages     = {741--763},
  year      = {2013},
  editor    = {Shalev-Shwartz, Shai and Steinwart, Ingo},
  volume    = {30},
  series    = {Proceedings of Machine Learning Research},
  address   = {Princeton, NJ, USA},
  month     = {12--14 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v30/Andrew13.pdf},
  url       = {https://proceedings.mlr.press/v30/Andrew13.html},
  abstract  = {We consider algorithms for “smoothed online convex optimization” problems, a variant of the class of online convex optimization problems that is strongly related to metrical task systems. Prior literature on these problems has focused on two performance metrics: regret and the competitive ratio. There exist known algorithms with sublinear regret and known algorithms with constant competitive ratios; however, no known algorithm achieves both simultaneously. We show that this is due to a fundamental incompatibility between these two metrics: no algorithm (deterministic or randomized) can achieve sublinear regret and a constant competitive ratio, even in the case when the objective functions are linear. However, we also exhibit an algorithm that, for the important special case of one-dimensional decision spaces, provides sublinear regret while maintaining a competitive ratio that grows arbitrarily slowly.}
}
Endnote
%0 Conference Paper
%T A Tale of Two Metrics: Simultaneous Bounds on Competitiveness and Regret
%A Lachlan Andrew
%A Siddharth Barman
%A Katrina Ligett
%A Minghong Lin
%A Adam Meyerson
%A Alan Roytman
%A Adam Wierman
%B Proceedings of the 26th Annual Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2013
%E Shai Shalev-Shwartz
%E Ingo Steinwart
%F pmlr-v30-Andrew13
%I PMLR
%P 741--763
%U https://proceedings.mlr.press/v30/Andrew13.html
%V 30
%X We consider algorithms for “smoothed online convex optimization” problems, a variant of the class of online convex optimization problems that is strongly related to metrical task systems. Prior literature on these problems has focused on two performance metrics: regret and the competitive ratio. There exist known algorithms with sublinear regret and known algorithms with constant competitive ratios; however, no known algorithm achieves both simultaneously. We show that this is due to a fundamental incompatibility between these two metrics: no algorithm (deterministic or randomized) can achieve sublinear regret and a constant competitive ratio, even in the case when the objective functions are linear. However, we also exhibit an algorithm that, for the important special case of one-dimensional decision spaces, provides sublinear regret while maintaining a competitive ratio that grows arbitrarily slowly.
RIS
TY - CPAPER
TI - A Tale of Two Metrics: Simultaneous Bounds on Competitiveness and Regret
AU - Lachlan Andrew
AU - Siddharth Barman
AU - Katrina Ligett
AU - Minghong Lin
AU - Adam Meyerson
AU - Alan Roytman
AU - Adam Wierman
BT - Proceedings of the 26th Annual Conference on Learning Theory
DA - 2013/06/13
ED - Shai Shalev-Shwartz
ED - Ingo Steinwart
ID - pmlr-v30-Andrew13
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 30
SP - 741
EP - 763
L1 - http://proceedings.mlr.press/v30/Andrew13.pdf
UR - https://proceedings.mlr.press/v30/Andrew13.html
AB - We consider algorithms for “smoothed online convex optimization” problems, a variant of the class of online convex optimization problems that is strongly related to metrical task systems. Prior literature on these problems has focused on two performance metrics: regret and the competitive ratio. There exist known algorithms with sublinear regret and known algorithms with constant competitive ratios; however, no known algorithm achieves both simultaneously. We show that this is due to a fundamental incompatibility between these two metrics: no algorithm (deterministic or randomized) can achieve sublinear regret and a constant competitive ratio, even in the case when the objective functions are linear. However, we also exhibit an algorithm that, for the important special case of one-dimensional decision spaces, provides sublinear regret while maintaining a competitive ratio that grows arbitrarily slowly.
ER -
APA
Andrew, L., Barman, S., Ligett, K., Lin, M., Meyerson, A., Roytman, A. & Wierman, A. (2013). A Tale of Two Metrics: Simultaneous Bounds on Competitiveness and Regret. Proceedings of the 26th Annual Conference on Learning Theory, in Proceedings of Machine Learning Research 30:741-763. Available from https://proceedings.mlr.press/v30/Andrew13.html.