Fast Excess Risk Rates via Offset Rademacher Complexity

Chenguang Duan, Yuling Jiao, Lican Kang, Xiliang Lu, Jerry Zhijian Yang
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:8697-8716, 2023.

Abstract

Based on offset Rademacher complexity, this work outlines a systematic framework for deriving sharp excess risk bounds in statistical learning without the Bernstein condition. In addition to recovering fast rates in a unified way for several parametric and nonparametric supervised learning models under minimal identifiability assumptions, we also obtain new and improved results for LAD (sparse) linear regression and for logistic regression with deep ReLU neural networks, respectively.
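For context, the offset Rademacher complexity referenced in the abstract is, in its standard form (following Liang, Rakhlin, and Sridharan, 2015), the Rademacher average of a function class penalized by a quadratic offset term; the exact variant used in this paper may differ in details:

```latex
% Offset Rademacher complexity of a class F on a sample x_1,...,x_n,
% with offset parameter c > 0 and i.i.d. Rademacher signs eps_i:
\mathcal{R}_n^{\mathrm{off}}(\mathcal{F}, c)
  \;=\;
  \mathbb{E}_{\varepsilon}\,
  \sup_{f \in \mathcal{F}}
  \frac{1}{n} \sum_{i=1}^{n}
  \Bigl[ \varepsilon_i f(x_i) \;-\; c\, f(x_i)^2 \Bigr].
```

The negative quadratic term $-c\,f(x_i)^2$ acts as self-normalizing localization, which is what enables fast excess risk rates without curvature conditions such as the Bernstein condition.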

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-duan23a,
  title     = {Fast Excess Risk Rates via Offset Rademacher Complexity},
  author    = {Duan, Chenguang and Jiao, Yuling and Kang, Lican and Lu, Xiliang and Yang, Jerry Zhijian},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {8697--8716},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/duan23a/duan23a.pdf},
  url       = {https://proceedings.mlr.press/v202/duan23a.html},
  abstract  = {Based on offset Rademacher complexity, this work outlines a systematic framework for deriving sharp excess risk bounds in statistical learning without the Bernstein condition. In addition to recovering fast rates in a unified way for several parametric and nonparametric supervised learning models under minimal identifiability assumptions, we also obtain new and improved results for LAD (sparse) linear regression and for logistic regression with deep ReLU neural networks, respectively.}
}
Endnote
%0 Conference Paper
%T Fast Excess Risk Rates via Offset Rademacher Complexity
%A Chenguang Duan
%A Yuling Jiao
%A Lican Kang
%A Xiliang Lu
%A Jerry Zhijian Yang
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-duan23a
%I PMLR
%P 8697--8716
%U https://proceedings.mlr.press/v202/duan23a.html
%V 202
%X Based on offset Rademacher complexity, this work outlines a systematic framework for deriving sharp excess risk bounds in statistical learning without the Bernstein condition. In addition to recovering fast rates in a unified way for several parametric and nonparametric supervised learning models under minimal identifiability assumptions, we also obtain new and improved results for LAD (sparse) linear regression and for logistic regression with deep ReLU neural networks, respectively.
APA
Duan, C., Jiao, Y., Kang, L., Lu, X. & Yang, J. Z. (2023). Fast Excess Risk Rates via Offset Rademacher Complexity. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:8697-8716. Available from https://proceedings.mlr.press/v202/duan23a.html.