Multiclass learning with margin: exponential rates with no bias-variance trade-off
Proceedings of the 39th International Conference on Machine Learning, PMLR 162:22260-22269, 2022.
We study the behavior of error bounds for multiclass classification under suitable margin conditions. For a wide variety of methods we prove that, under a hard-margin condition, the classification error decreases exponentially fast without any bias-variance trade-off. Different convergence rates are obtained under correspondingly different margin assumptions. Through a self-contained and instructive analysis, we generalize known results from the binary to the multiclass setting.
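As an illustrative sketch only (not the paper's exact statement), a hard-margin condition of the kind referred to in the abstract, and the resulting exponential rate, typically take the following form:

```latex
% Hard-margin assumption: the conditional probability of the Bayes-optimal
% class exceeds that of every other class by a fixed gap \tau > 0.
% Here \rho_k(x) = \mathbb{P}(Y = k \mid X = x) and
% y^*(x) = \arg\max_k \rho_k(x) denotes the Bayes classifier.
\exists\, \tau > 0 : \quad
\rho_{y^*(x)}(x) \;-\; \max_{k \neq y^*(x)} \rho_k(x) \;\ge\; \tau
\quad \text{for almost every } x .

% Under such a condition, the excess misclassification risk of an
% estimator \hat{f}_n trained on n samples decays exponentially in n:
\mathbb{E}\big[\, R(\hat{f}_n) - R(f^*) \,\big] \;\le\; C\, e^{-c\, n},
% for constants C, c > 0 depending on \tau but not on n.
```

The constants and the precise estimator class are assumptions here; the paper itself establishes the exact conditions and rates.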