Stochastic Proximal Algorithms for AUC Maximization

Michael Natole, Yiming Ying, Siwei Lyu
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:3710-3719, 2018.

Abstract

Stochastic optimization algorithms such as SGD update the model sequentially with cheap per-iteration costs, making them amenable to large-scale data analysis. However, most existing studies focus on classification accuracy, and their results cannot be directly applied to the important problems of maximizing the area under the ROC curve (AUC) in imbalanced classification and bipartite ranking. In this paper, we develop a novel stochastic proximal algorithm for AUC maximization, referred to as SPAM. In contrast to the previous literature, SPAM handles non-smooth penalty functions and achieves a convergence rate of O(log t / t) for strongly convex objectives, while both the space and per-iteration costs remain those of a single datum.
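
To make the update concrete, below is a minimal Python sketch of a SPAM-style iteration, written under stated assumptions rather than taken from the paper: it uses the saddle-point reformulation of the AUC objective due to Ying et al. (2016), estimates the class-conditional means and the positive-class ratio p online, and takes an elastic-net penalty as the non-smooth regularizer (its proximal operator is soft-thresholding followed by shrinkage). The names spam_sketch and prox_elastic_net are illustrative, not from the paper.

    import numpy as np

    def prox_elastic_net(v, eta, beta1, beta2):
        # Proximal operator of eta * (beta1*||w||_1 + 0.5*beta2*||w||_2^2):
        # soft-thresholding followed by a multiplicative shrinkage.
        return np.sign(v) * np.maximum(np.abs(v) - eta * beta1, 0.0) / (1.0 + eta * beta2)

    def spam_sketch(stream, d, beta1=1e-4, beta2=1e-3, C=1.0):
        # stream yields (x, y) with x an R^d feature vector and y in {+1, -1}.
        w = np.zeros(d)
        m_pos, m_neg = np.zeros(d), np.zeros(d)  # running class-conditional means
        n_pos = n_neg = 0
        for t, (x, y) in enumerate(stream, start=1):
            # Online estimates of p = Pr(y = +1), E[x|y=+1], E[x|y=-1].
            if y == 1:
                n_pos += 1
                m_pos += (x - m_pos) / n_pos
            else:
                n_neg += 1
                m_neg += (x - m_neg) / n_neg
            p = n_pos / (n_pos + n_neg)
            # Closed-form inner variables given w; this is what keeps the
            # storage at one datum: a = <w, m_pos>, b = <w, m_neg>, alpha = b - a.
            a, b = w @ m_pos, w @ m_neg
            alpha = b - a
            # Per-sample gradient of the saddle-point objective w.r.t. w.
            if y == 1:
                grad = 2.0 * (1 - p) * ((w @ x - a) - (1 + alpha)) * x
            else:
                grad = 2.0 * p * ((w @ x - b) + (1 + alpha)) * x
            eta = C / t  # O(1/t) step size for the strongly convex setting
            w = prox_elastic_net(w - eta * grad, eta, beta1, beta2)
        return w

The O(1/t) step size mirrors the strongly convex regime in which the O(log t / t) rate is stated; with a merely convex penalty a slower decaying step size would be the usual choice.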

Cite this Paper

BibTeX
@InProceedings{pmlr-v80-natole18a,
  title     = {Stochastic Proximal Algorithms for {AUC} Maximization},
  author    = {Natole, Jr., Michael and Ying, Yiming and Lyu, Siwei},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages     = {3710--3719},
  year      = {2018},
  editor    = {Dy, Jennifer and Krause, Andreas},
  volume    = {80},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--15 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v80/natole18a/natole18a.pdf},
  url       = {https://proceedings.mlr.press/v80/natole18a.html}
}
Endnote
%0 Conference Paper
%T Stochastic Proximal Algorithms for AUC Maximization
%A Michael Natole
%A Yiming Ying
%A Siwei Lyu
%B Proceedings of the 35th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2018
%E Jennifer Dy
%E Andreas Krause
%F pmlr-v80-natole18a
%I PMLR
%P 3710--3719
%U https://proceedings.mlr.press/v80/natole18a.html
%V 80
APA
Natole, M., Ying, Y. & Lyu, S. (2018). Stochastic Proximal Algorithms for AUC Maximization. Proceedings of the 35th International Conference on Machine Learning, in Proceedings of Machine Learning Research 80:3710-3719. Available from https://proceedings.mlr.press/v80/natole18a.html.
