Learning Classifiers with Fenchel-Young Losses: Generalized Entropies, Margins, and Algorithms
Proceedings of Machine Learning Research, PMLR 89:606-615, 2019.
Abstract
This paper studies Fenchel-Young losses, a generic way to construct convex loss functions from a regularization function. We analyze their properties in depth, showing that they unify many well-known loss functions and make it easy to create useful new ones. Fenchel-Young losses constructed from a generalized entropy, including the Shannon and Tsallis entropies, induce predictive probability distributions. We formulate conditions for a generalized entropy to yield losses with a separation margin, and probability distributions with sparse support. Finally, we derive efficient algorithms, making Fenchel-Young losses appealing both in theory and practice.
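As a minimal illustration of the construction described above, the sketch below computes a Fenchel-Young loss L(θ; y) = Ω*(θ) + Ω(y) − ⟨θ, y⟩ where Ω is taken to be the negative Shannon entropy, whose convex conjugate Ω* is the log-sum-exp function. The function names and NumPy implementation here are illustrative assumptions, not code from the paper; with a one-hot target, this choice of Ω recovers the familiar logistic (softmax cross-entropy) loss.

```python
import numpy as np

def neg_shannon_entropy(p):
    # Ω(p) = Σ_i p_i log p_i (negative Shannon entropy), with 0 log 0 = 0.
    p = np.asarray(p, dtype=float)
    return float(np.sum(np.where(p > 0, p * np.log(p, where=p > 0), 0.0)))

def fenchel_young_loss(theta, y):
    # L(θ; y) = Ω*(θ) + Ω(y) − ⟨θ, y⟩, with Ω = negative Shannon entropy,
    # so Ω*(θ) = log Σ_i exp(θ_i), computed stably by shifting by max(θ).
    theta = np.asarray(theta, dtype=float)
    m = theta.max()
    log_sum_exp = m + np.log(np.sum(np.exp(theta - m)))
    return log_sum_exp + neg_shannon_entropy(y) - float(np.dot(theta, y))

# Example: one-hot target. Since Ω(y) = 0 for a one-hot y, the loss reduces
# to log-sum-exp(θ) − θ_k, i.e. the logistic / softmax cross-entropy loss.
theta = np.array([2.0, 0.5, -1.0])
y = np.array([1.0, 0.0, 0.0])
loss = fenchel_young_loss(theta, y)

# Cross-check against −log softmax(θ)_0.
softmax = np.exp(theta - theta.max()) / np.sum(np.exp(theta - theta.max()))
```

Swapping in a different generalized entropy for Ω (e.g. a Tsallis entropy) changes the conjugate and yields a different loss in the same family, which is the unification the abstract refers to.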