Efficient Methods for Online Multiclass Logistic Regression

Naman Agarwal, Satyen Kale, Julian Zimmert
Proceedings of The 33rd International Conference on Algorithmic Learning Theory, PMLR 167:3-33, 2022.

Abstract

Multiclass logistic regression is a fundamental task in machine learning with applications in classification and boosting. Previous work (Foster et al., 2018) has highlighted the importance of improper predictors for achieving “fast rates” in the online multiclass logistic regression problem without suffering exponentially from secondary problem parameters, such as the norm of the predictors in the comparison class. While Foster et al. (2018) introduced a statistically optimal algorithm, it is computationally intractable in practice, since its running time is a large polynomial in the time horizon and the dimension of the input feature vectors. In this paper, we develop a new algorithm, FOLKLORE, for this problem that runs significantly faster than the algorithm of Foster et al. (2018): its running time per iteration scales quadratically in the dimension, at the cost of a linear dependence on the norm of the predictors in the regret bound. This yields the first practical algorithm for online multiclass logistic regression, resolving an open problem of Foster et al. (2018). Furthermore, we show that our algorithm can be applied to online bandit multiclass prediction and online multiclass boosting, yielding more practical algorithms for both problems than those of Foster et al. (2018), with similar performance guarantees. Finally, we also provide an online-to-batch conversion result for our algorithm.

Cite this Paper


BibTeX
@InProceedings{pmlr-v167-agarwal22a,
  title     = {Efficient Methods for Online Multiclass Logistic Regression},
  author    = {Agarwal, Naman and Kale, Satyen and Zimmert, Julian},
  booktitle = {Proceedings of The 33rd International Conference on Algorithmic Learning Theory},
  pages     = {3--33},
  year      = {2022},
  editor    = {Dasgupta, Sanjoy and Haghtalab, Nika},
  volume    = {167},
  series    = {Proceedings of Machine Learning Research},
  month     = {29 Mar--01 Apr},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v167/agarwal22a/agarwal22a.pdf},
  url       = {https://proceedings.mlr.press/v167/agarwal22a.html},
  abstract  = {Multiclass logistic regression is a fundamental task in machine learning with applications in classification and boosting. Previous work (Foster et al., 2018) has highlighted the importance of improper predictors for achieving “fast rates” in the online multiclass logistic regression problem without suffering exponentially from secondary problem parameters, such as the norm of the predictors in the comparison class. While Foster et al. (2018) introduced a statistically optimal algorithm, it is in practice computationally intractable due to its run-time complexity being a large polynomial in the time horizon and dimension of input feature vectors. In this paper, we develop a new algorithm, FOLKLORE, for the problem which runs significantly faster than the algorithm of Foster et al. (2018)–the running time per iteration scales quadratically in the dimension–at the cost of a linear dependence on the norm of the predictors in the regret bound. This yields the first practical algorithm for online multiclass logistic regression, resolving an open problem of Foster et al. (2018). Furthermore, we show that our algorithm can be applied to online bandit multiclass prediction and online multiclass boosting, yielding more practical algorithms for both problems compared to the ones in Foster et al. (2018) with similar performance guarantees. Finally, we also provide an online-to-batch conversion result for our algorithm.}
}
Endnote
%0 Conference Paper
%T Efficient Methods for Online Multiclass Logistic Regression
%A Naman Agarwal
%A Satyen Kale
%A Julian Zimmert
%B Proceedings of The 33rd International Conference on Algorithmic Learning Theory
%C Proceedings of Machine Learning Research
%D 2022
%E Sanjoy Dasgupta
%E Nika Haghtalab
%F pmlr-v167-agarwal22a
%I PMLR
%P 3--33
%U https://proceedings.mlr.press/v167/agarwal22a.html
%V 167
%X Multiclass logistic regression is a fundamental task in machine learning with applications in classification and boosting. Previous work (Foster et al., 2018) has highlighted the importance of improper predictors for achieving “fast rates” in the online multiclass logistic regression problem without suffering exponentially from secondary problem parameters, such as the norm of the predictors in the comparison class. While Foster et al. (2018) introduced a statistically optimal algorithm, it is in practice computationally intractable due to its run-time complexity being a large polynomial in the time horizon and dimension of input feature vectors. In this paper, we develop a new algorithm, FOLKLORE, for the problem which runs significantly faster than the algorithm of Foster et al. (2018)–the running time per iteration scales quadratically in the dimension–at the cost of a linear dependence on the norm of the predictors in the regret bound. This yields the first practical algorithm for online multiclass logistic regression, resolving an open problem of Foster et al. (2018). Furthermore, we show that our algorithm can be applied to online bandit multiclass prediction and online multiclass boosting, yielding more practical algorithms for both problems compared to the ones in Foster et al. (2018) with similar performance guarantees. Finally, we also provide an online-to-batch conversion result for our algorithm.
APA
Agarwal, N., Kale, S. & Zimmert, J. (2022). Efficient Methods for Online Multiclass Logistic Regression. Proceedings of The 33rd International Conference on Algorithmic Learning Theory, in Proceedings of Machine Learning Research 167:3-33. Available from https://proceedings.mlr.press/v167/agarwal22a.html.