Geometric Losses for Distributional Learning

Arthur Mensch, Mathieu Blondel, Gabriel Peyré
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:4516-4525, 2019.

Abstract

Building upon recent advances in entropy-regularized optimal transport, and upon Fenchel duality between measures and continuous functions, we propose a generalization of the logistic loss that incorporates a metric or cost between classes. Unlike previous attempts to use optimal transport distances for learning, our loss results in unconstrained convex objective functions, supports infinite (or very large) class spaces, and naturally defines a geometric generalization of the softmax operator. The geometric properties of this loss make it suitable for predicting sparse and singular distributions, for instance supported on curves or hyper-surfaces. We study the theoretical properties of our loss and showcase its effectiveness on two applications: ordinal regression and drawing generation.
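
The "geometric generalization of the softmax operator" mentioned above can be made concrete with a small sketch. The construction below is an illustration only, built from the entropic soft-c-transform of regularized optimal transport; it is not the paper's exact g-softmax operator, and the names smoothed_scores, geometric_softmax, the cost matrix C, and the temperature eps are all hypothetical. Its key sanity check mirrors the paper's claim that the loss generalizes the logistic loss: with the "discrete" cost (zero on the diagonal, infinite off it), it reduces exactly to the standard softmax.

    import numpy as np

    def smoothed_scores(f, C, eps=1.0):
        # Entropic smoothing of the class scores by the ground cost:
        #   f~_i = eps * log sum_j exp((f[j] - C[i, j]) / eps)
        z = (f[None, :] - C) / eps           # shape (k, k)
        zmax = z.max(axis=1, keepdims=True)  # stabilize the log-sum-exp
        return eps * (zmax[:, 0] + np.log(np.exp(z - zmax).sum(axis=1)))

    def geometric_softmax(f, C, eps=1.0):
        # Softmax of the cost-smoothed scores. With the discrete cost
        # (C[i, i] = 0, C[i, j] = inf for i != j), smoothed_scores
        # returns f unchanged, so this is the standard softmax of f.
        g = smoothed_scores(f, C, eps)
        g = g - g.max()                      # stabilize the exponential
        p = np.exp(g)
        return p / p.sum()

    # Ordinal classes 0..4 with a squared ground cost on the line.
    k = 5
    idx = np.arange(k)
    C = (idx[:, None] - idx[None, :]) ** 2.0
    f = np.array([0.0, 0.0, 3.0, 0.0, 0.0])
    print(geometric_softmax(f, C, eps=1.0))   # mass spread around class 2
    C_disc = np.where(np.eye(k, dtype=bool), 0.0, np.inf)
    print(geometric_softmax(f, C_disc))       # equals standard softmax(f)

With the squared ground cost, probability mass leaks toward neighboring classes rather than concentrating on a single label, which is the kind of cost-aware behavior the abstract's ordinal regression application relies on.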

Cite this Paper

BibTeX
@InProceedings{pmlr-v97-mensch19a,
  title     = {Geometric Losses for Distributional Learning},
  author    = {Mensch, Arthur and Blondel, Mathieu and Peyr{\'e}, Gabriel},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {4516--4525},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/mensch19a/mensch19a.pdf},
  url       = {https://proceedings.mlr.press/v97/mensch19a.html}
}
EndNote
%0 Conference Paper
%T Geometric Losses for Distributional Learning
%A Arthur Mensch
%A Mathieu Blondel
%A Gabriel Peyré
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-mensch19a
%I PMLR
%P 4516--4525
%U https://proceedings.mlr.press/v97/mensch19a.html
%V 97
APA
Mensch, A., Blondel, M. & Peyré, G. (2019). Geometric Losses for Distributional Learning. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:4516-4525. Available from https://proceedings.mlr.press/v97/mensch19a.html.
