Optimal learning via local entropies and sample compression

Nikita Zhivotovskiy
Proceedings of the 2017 Conference on Learning Theory, PMLR 65:2023-2065, 2017.

Abstract

Under margin assumptions, we prove several risk bounds, expressed via the distribution-dependent local entropies of the classes or the sizes of specific sample compression schemes. In some cases, our guarantees are optimal up to constant factors for families of classes. We discuss limitations of our approach and give several applications. In particular, we provide a new tight PAC bound for the hard-margin SVM, an extended analysis of certain empirical risk minimizers under log-concave distributions, a new variant of an online-to-batch conversion, and distribution-dependent localized bounds in the aggregation framework. As part of our results, we give a new upper bound on the uniform deviations under Bernstein assumptions, which may be of independent interest. The proofs for the sample compression schemes are based on the moment method combined with the analysis of voting algorithms.
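
To fix ideas, here is a minimal sketch of two notions named in the abstract, using standard textbook formulations; the paper's exact definitions, notation, and constants may differ. A class $\mathcal{H}$ satisfies a $(\beta, B)$-Bernstein assumption with respect to a distribution $P$ and risk $R$ if, writing $h^{\ast}$ for the risk minimizer in $\mathcal{H}$ and $\ell_h$ for the loss of $h$,

\[
\mathbb{E}\big[(\ell_h - \ell_{h^{\ast}})^2\big] \;\le\; B\,\big(R(h) - R(h^{\ast})\big)^{\beta} \qquad \text{for all } h \in \mathcal{H},
\]

with margin assumptions in the sense of Massart corresponding to the favourable case $\beta = 1$. A distribution-dependent local entropy at scale $\varepsilon$ is, in one common formulation,

\[
\mathcal{H}^{\mathrm{loc}}(\varepsilon) \;=\; \sup_{h \in \mathcal{H}} \log \mathcal{M}\big(\{g \in \mathcal{H} : \|g - h\|_{L_2(P)} \le c\,\varepsilon\},\; \varepsilon,\; L_2(P)\big),
\]

where $\mathcal{M}(\cdot, \varepsilon, L_2(P))$ denotes an $\varepsilon$-packing number and $c > 1$ is an absolute constant. The symbols $\mathcal{H}^{\mathrm{loc}}$, $c$, and the choice of the $L_2(P)$ pseudometric here are illustrative placeholders rather than the paper's own notation.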

Cite this Paper


BibTeX
@InProceedings{pmlr-v65-nikita17a,
  title     = {Optimal learning via local entropies and sample compression},
  author    = {Zhivotovskiy, Nikita},
  booktitle = {Proceedings of the 2017 Conference on Learning Theory},
  pages     = {2023--2065},
  year      = {2017},
  editor    = {Kale, Satyen and Shamir, Ohad},
  volume    = {65},
  series    = {Proceedings of Machine Learning Research},
  month     = {07--10 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v65/nikita17a/nikita17a.pdf},
  url       = {https://proceedings.mlr.press/v65/nikita17a.html},
  abstract  = {Under margin assumptions, we prove several risk bounds, represented via the distribution dependent local entropies of the classes or the sizes of specific sample compression schemes. In some cases, our guarantees are optimal up to constant factors for families of classes. We discuss limitations of our approach and give several applications. In particular, we provide a new tight PAC bound for the hard-margin SVM, an extended analysis of certain empirical risk minimizers under log-concave distributions, a new variant of an online to batch conversion, and distribution dependent localized bounds in the aggregation framework. As a part of our results, we give a new upper bound for the uniform deviations under Bernstein assumptions, which may be of independent interest. The proofs for the sample compression schemes are based on the moment method combined with the analysis of voting algorithms.}
}
Endnote
%0 Conference Paper
%T Optimal learning via local entropies and sample compression
%A Nikita Zhivotovskiy
%B Proceedings of the 2017 Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2017
%E Satyen Kale
%E Ohad Shamir
%F pmlr-v65-nikita17a
%I PMLR
%P 2023--2065
%U https://proceedings.mlr.press/v65/nikita17a.html
%V 65
%X Under margin assumptions, we prove several risk bounds, represented via the distribution dependent local entropies of the classes or the sizes of specific sample compression schemes. In some cases, our guarantees are optimal up to constant factors for families of classes. We discuss limitations of our approach and give several applications. In particular, we provide a new tight PAC bound for the hard-margin SVM, an extended analysis of certain empirical risk minimizers under log-concave distributions, a new variant of an online to batch conversion, and distribution dependent localized bounds in the aggregation framework. As a part of our results, we give a new upper bound for the uniform deviations under Bernstein assumptions, which may be of independent interest. The proofs for the sample compression schemes are based on the moment method combined with the analysis of voting algorithms.
APA
Zhivotovskiy, N. (2017). Optimal learning via local entropies and sample compression. Proceedings of the 2017 Conference on Learning Theory, in Proceedings of Machine Learning Research 65:2023-2065. Available from https://proceedings.mlr.press/v65/nikita17a.html.
