A nonasymptotic law of iterated logarithm for general M-estimators

Nicolas Schreuder, Victor-Emmanuel Brunel, Arnak Dalalyan
Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, PMLR 108:1331-1341, 2020.

Abstract

M-estimators are ubiquitous in machine learning and statistical learning theory. They are used both for defining prediction strategies and for evaluating their precision. In this paper, we propose the first non-asymptotic 'any-time' deviation bounds for general M-estimators, where 'any-time' means that the bound holds with a prescribed probability for every sample size. These bounds are non-asymptotic versions of the law of iterated logarithm. They are established under general assumptions such as Lipschitz continuity of the loss function and (local) curvature of the population risk. These conditions are satisfied for most examples used in machine learning, including those ensuring robustness to outliers and to heavy-tailed distributions. As an example of application, we consider the problem of best arm identification in a stochastic multi-armed bandit setting. We show that the established bound can be converted into a new algorithm, with provably optimal theoretical guarantees. Numerical experiments illustrating the validity of the algorithm are reported.
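For context, the classical (asymptotic) law of iterated logarithm, of which the paper's bounds are non-asymptotic analogues, can be stated as follows for i.i.d. random variables $X_1, X_2, \dots$ with mean $\mu$ and finite variance $\sigma^2$:

```latex
% Classical law of iterated logarithm (Hartman–Wintner):
% for S_n = \sum_{i=1}^{n} (X_i - \mu),
\limsup_{n \to \infty} \frac{S_n}{\sqrt{2 n \log \log n}} = \sigma
\quad \text{almost surely.}
```

The paper replaces this almost-sure asymptotic statement with deviation bounds of the same $\sqrt{n \log \log n}$ order that hold simultaneously for all sample sizes with a prescribed probability.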

Cite this Paper


BibTeX
@InProceedings{pmlr-v108-dalalyan20a,
  title     = {A nonasymptotic law of iterated logarithm for general M-estimators},
  author    = {Schreuder, Nicolas and Brunel, Victor-Emmanuel and Dalalyan, Arnak},
  pages     = {1331--1341},
  year      = {2020},
  editor    = {Silvia Chiappa and Roberto Calandra},
  volume    = {108},
  series    = {Proceedings of Machine Learning Research},
  address   = {Online},
  month     = {26--28 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v108/dalalyan20a/dalalyan20a.pdf},
  url       = {http://proceedings.mlr.press/v108/dalalyan20a.html},
  abstract  = {M-estimators are ubiquitous in machine learning and statistical learning theory. They are used both for defining prediction strategies and for evaluating their precision. In this paper, we propose the first non-asymptotic 'any-time' deviation bounds for general M-estimators, where 'any-time' means that the bound holds with a prescribed probability for every sample size. These bounds are non-asymptotic versions of the law of iterated logarithm. They are established under general assumptions such as Lipschitz continuity of the loss function and (local) curvature of the population risk. These conditions are satisfied for most examples used in machine learning, including those ensuring robustness to outliers and to heavy-tailed distributions. As an example of application, we consider the problem of best arm identification in a stochastic multi-armed bandit setting. We show that the established bound can be converted into a new algorithm, with provably optimal theoretical guarantees. Numerical experiments illustrating the validity of the algorithm are reported.}
}
Endnote
%0 Conference Paper
%T A nonasymptotic law of iterated logarithm for general M-estimators
%A Nicolas Schreuder
%A Victor-Emmanuel Brunel
%A Arnak Dalalyan
%B Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2020
%E Silvia Chiappa
%E Roberto Calandra
%F pmlr-v108-dalalyan20a
%I PMLR
%J Proceedings of Machine Learning Research
%P 1331--1341
%U http://proceedings.mlr.press
%V 108
%W PMLR
%X M-estimators are ubiquitous in machine learning and statistical learning theory. They are used both for defining prediction strategies and for evaluating their precision. In this paper, we propose the first non-asymptotic 'any-time' deviation bounds for general M-estimators, where 'any-time' means that the bound holds with a prescribed probability for every sample size. These bounds are non-asymptotic versions of the law of iterated logarithm. They are established under general assumptions such as Lipschitz continuity of the loss function and (local) curvature of the population risk. These conditions are satisfied for most examples used in machine learning, including those ensuring robustness to outliers and to heavy-tailed distributions. As an example of application, we consider the problem of best arm identification in a stochastic multi-armed bandit setting. We show that the established bound can be converted into a new algorithm, with provably optimal theoretical guarantees. Numerical experiments illustrating the validity of the algorithm are reported.
APA
Schreuder, N., Brunel, V.-E. &amp; Dalalyan, A. (2020). A nonasymptotic law of iterated logarithm for general M-estimators. Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, in PMLR 108:1331-1341.
