Localization and Adaptation in Online Learning

Alexander Rakhlin, Ohad Shamir, Karthik Sridharan
Proceedings of the Sixteenth International Conference on Artificial Intelligence and Statistics, PMLR 31:516-526, 2013.

Abstract

We introduce a formalism of localization for online learning problems, which, similarly to statistical learning theory, can be used to obtain fast rates. In particular, we introduce local sequential Rademacher complexities and other local measures. Based on the idea of relaxations for deriving algorithms, we provide a template method that takes advantage of localization. Furthermore, we build a general adaptive method that can take advantage of the suboptimality of the observed sequence. We illustrate the utility of the introduced concepts on several problems. Among them is a novel upper bound on regret in terms of classical Rademacher complexity when the data are i.i.d.
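As background for the terms used in the abstract (these standard definitions are not given on this page and are included only for orientation), the regret of an online learner over T rounds and the classical Rademacher complexity are typically written as

  \mathrm{Reg}_T = \sum_{t=1}^{T} \ell(f_t, z_t) - \inf_{f \in \mathcal{F}} \sum_{t=1}^{T} \ell(f, z_t),
  \qquad
  \mathcal{R}_T(\mathcal{F}) = \mathbb{E}_{\epsilon} \sup_{f \in \mathcal{F}} \frac{1}{T} \sum_{t=1}^{T} \epsilon_t f(x_t),

where \epsilon_1, \dots, \epsilon_T are independent random signs; the sequential Rademacher complexity replaces the fixed sample x_1, \dots, x_T with a tree of instances indexed by those signs, and the paper's local versions restrict the supremum to a neighborhood of the best function.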

Cite this Paper


BibTeX
@InProceedings{pmlr-v31-rakhlin13a,
  title     = {Localization and Adaptation in Online Learning},
  author    = {Rakhlin, Alexander and Shamir, Ohad and Sridharan, Karthik},
  booktitle = {Proceedings of the Sixteenth International Conference on Artificial Intelligence and Statistics},
  pages     = {516--526},
  year      = {2013},
  editor    = {Carvalho, Carlos M. and Ravikumar, Pradeep},
  volume    = {31},
  series    = {Proceedings of Machine Learning Research},
  address   = {Scottsdale, Arizona, USA},
  month     = {29 Apr--01 May},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v31/rakhlin13a.pdf},
  url       = {https://proceedings.mlr.press/v31/rakhlin13a.html},
  abstract  = {We introduce a formalism of localization for online learning problems, which, similarly to statistical learning theory, can be used to obtain fast rates. In particular, we introduce local sequential Rademacher complexities and other local measures. Based on the idea of relaxations for deriving algorithms, we provide a template method that takes advantage of localization. Furthermore, we build a general adaptive method that can take advantage of the suboptimality of the observed sequence. We illustrate the utility of the introduced concepts on several problems. Among them is a novel upper bound on regret in terms of classical Rademacher complexity when the data are i.i.d.}
}
APA
Rakhlin, A., Shamir, O., & Sridharan, K. (2013). Localization and Adaptation in Online Learning. Proceedings of the Sixteenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 31:516-526. Available from https://proceedings.mlr.press/v31/rakhlin13a.html.
