Fast Excess Risk Rates via Offset Rademacher Complexity
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:8697-8716, 2023.
Abstract
Based on offset Rademacher complexity, this work develops a systematic framework for deriving sharp excess risk bounds in statistical learning without the Bernstein condition. Beyond recovering fast rates in a unified way for several parametric and nonparametric supervised learning models under minimal identifiability assumptions, we also obtain new and improved results for LAD (sparse) linear regression and for logistic regression with deep ReLU neural networks, respectively.