Learning Classifiers with Fenchel-Young Losses: Generalized Entropies, Margins, and Algorithms

Mathieu Blondel, Andre Martins, Vlad Niculae
Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, PMLR 89:606-615, 2019.

Abstract

This paper studies Fenchel-Young losses, a generic way to construct convex loss functions from a regularization function. We analyze their properties in depth, showing that they unify many well-known loss functions and allow one to create useful new ones easily. Fenchel-Young losses constructed from a generalized entropy, including the Shannon and Tsallis entropies, induce predictive probability distributions. We formulate conditions for a generalized entropy to yield losses with a separation margin, and probability distributions with sparse support. Finally, we derive efficient algorithms, making Fenchel-Young losses appealing both in theory and practice.
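As a concrete illustration of the construction (a minimal sketch, not code from the paper): the Fenchel-Young loss generated by a regularizer Ω is L_Ω(θ; y) = Ω*(θ) + Ω(y) − ⟨θ, y⟩, where Ω* is the convex conjugate. Taking Ω to be the negative Shannon entropy restricted to the simplex gives Ω*(θ) = logsumexp(θ) and Ω(e_y) = 0 for a one-hot target, so the loss reduces to the familiar multinomial logistic (cross-entropy) loss:

```python
import numpy as np

def fy_loss_shannon(theta, y):
    """Fenchel-Young loss L(theta; e_y) = Omega*(theta) + Omega(e_y) - <theta, e_y>
    with Omega = negative Shannon entropy on the simplex.
    Then Omega* = logsumexp and Omega(e_y) = 0, recovering the logistic loss."""
    m = theta.max()
    lse = m + np.log(np.sum(np.exp(theta - m)))  # numerically stable logsumexp
    return lse - theta[y]                        # <theta, e_y> = theta[y]

theta = np.array([2.0, 0.5, -1.0])
# Sanity check: agrees with cross-entropy computed from softmax probabilities.
p = np.exp(theta - theta.max())
p /= p.sum()
assert np.isclose(fy_loss_shannon(theta, 0), -np.log(p[0]))
```

Other choices of Ω (e.g. Tsallis entropies, as studied in the paper) yield different losses and different predictive distributions, including sparse ones.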

Cite this Paper


BibTeX
@InProceedings{pmlr-v89-blondel19a,
  title     = {Learning Classifiers with Fenchel-Young Losses: Generalized Entropies, Margins, and Algorithms},
  author    = {Blondel, Mathieu and Martins, Andre and Niculae, Vlad},
  booktitle = {Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics},
  pages     = {606--615},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Sugiyama, Masashi},
  volume    = {89},
  series    = {Proceedings of Machine Learning Research},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v89/blondel19a/blondel19a.pdf},
  url       = {http://proceedings.mlr.press/v89/blondel19a.html},
  abstract  = {This paper studies Fenchel-Young losses, a generic way to construct convex loss functions from a regularization function. We analyze their properties in depth, showing that they unify many well-known loss functions and allow to create useful new ones easily. Fenchel-Young losses constructed from a generalized entropy, including the Shannon and Tsallis entropies, induce predictive probability distributions. We formulate conditions for a generalized entropy to yield losses with a separation margin, and probability distributions with sparse support. Finally, we derive efficient algorithms, making Fenchel-Young losses appealing both in theory and practice.}
}
Endnote
%0 Conference Paper
%T Learning Classifiers with Fenchel-Young Losses: Generalized Entropies, Margins, and Algorithms
%A Mathieu Blondel
%A Andre Martins
%A Vlad Niculae
%B Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Masashi Sugiyama
%F pmlr-v89-blondel19a
%I PMLR
%P 606--615
%U http://proceedings.mlr.press/v89/blondel19a.html
%V 89
%X This paper studies Fenchel-Young losses, a generic way to construct convex loss functions from a regularization function. We analyze their properties in depth, showing that they unify many well-known loss functions and allow to create useful new ones easily. Fenchel-Young losses constructed from a generalized entropy, including the Shannon and Tsallis entropies, induce predictive probability distributions. We formulate conditions for a generalized entropy to yield losses with a separation margin, and probability distributions with sparse support. Finally, we derive efficient algorithms, making Fenchel-Young losses appealing both in theory and practice.
APA
Blondel, M., Martins, A. & Niculae, V. (2019). Learning Classifiers with Fenchel-Young Losses: Generalized Entropies, Margins, and Algorithms. Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 89:606-615. Available from http://proceedings.mlr.press/v89/blondel19a.html.