Bandit Multiclass Linear Classification: Efficient Algorithms for the Separable Case

Alina Beygelzimer, David Pal, Balazs Szorenyi, Devanathan Thiruvenkatachari, Chen-Yu Wei, Chicheng Zhang
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:624-633, 2019.

Abstract

We study the problem of efficient online multiclass linear classification with bandit feedback, where all examples belong to one of $K$ classes and lie in the $d$-dimensional Euclidean space. Previous works have left open the challenge of designing efficient algorithms with finite mistake bounds when the data is linearly separable by a margin $\gamma$. In this work, we take a first step towards this problem. We consider two notions of linear separability: strong and weak. 1. Under the strong linear separability condition, we design an efficient algorithm that achieves a near-optimal mistake bound of $O\left(\frac{K}{\gamma^2} \right)$. 2. Under the more challenging weak linear separability condition, we design an efficient algorithm with a mistake bound of $2^{\widetilde{O}(\min(K \log^2 \frac{1}{\gamma}, \sqrt{\frac{1}{\gamma}} \log K))}$. Our algorithm is based on kernel Perceptron, which is inspired by the work of Klivans & Servedio (2008) on improperly learning intersection of halfspaces.
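The strongly separable setting described above can be made concrete with a small Perceptron-style sketch. This is an illustrative approximation in the spirit of the paper's approach, not a transcription of its algorithm: the function name, the uniform-exploration rule, and the exact update are assumptions made here for demonstration. Under strong separability each class $i$ has a vector $w_i$ with $w_i \cdot x \geq \gamma$ when $x$ belongs to class $i$ and $w_i \cdot x \leq -\gamma$ otherwise, so one weight vector per class suffices.

```python
import numpy as np

def bandit_perceptron(stream, K, d, rng=None):
    """Illustrative sketch (not the paper's exact algorithm): a
    Perceptron-style learner for K-class online classification with
    bandit feedback under the strong separability assumption.

    `stream` yields (x, y) pairs, but the learner only observes
    whether its prediction yhat equals y, never y itself.
    """
    rng = rng or np.random.default_rng(0)
    W = np.zeros((K, d))              # one weight vector per class
    mistakes = 0
    for x, y in stream:
        scores = W @ x
        positive = np.flatnonzero(scores > 0)
        if positive.size > 0:
            # Exploit: some class already claims x on the positive side.
            yhat = positive[rng.integers(positive.size)]
        else:
            # Explore: no class claims x, guess uniformly at random.
            yhat = rng.integers(K)
        correct = (yhat == y)         # bandit feedback: one bit only
        if correct:
            W[yhat] += x              # reinforce the predicted class
        else:
            mistakes += 1
            W[yhat] -= x              # push the predicted class away from x
    return W, mistakes
```

Because each update is mistake-driven (or margin-reinforcing), the usual Perceptron-style potential argument bounds the number of updates per class on strongly separable data, which is the intuition behind a mistake bound scaling with $K/\gamma^2$.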

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-beygelzimer19a,
  title     = {Bandit Multiclass Linear Classification: Efficient Algorithms for the Separable Case},
  author    = {Beygelzimer, Alina and Pal, David and Szorenyi, Balazs and Thiruvenkatachari, Devanathan and Wei, Chen-Yu and Zhang, Chicheng},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {624--633},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/beygelzimer19a/beygelzimer19a.pdf},
  url       = {https://proceedings.mlr.press/v97/beygelzimer19a.html},
  abstract  = {We study the problem of efficient online multiclass linear classification with bandit feedback, where all examples belong to one of $K$ classes and lie in the $d$-dimensional Euclidean space. Previous works have left open the challenge of designing efficient algorithms with finite mistake bounds when the data is linearly separable by a margin $\gamma$. In this work, we take a first step towards this problem. We consider two notions of linear separability: strong and weak. 1. Under the strong linear separability condition, we design an efficient algorithm that achieves a near-optimal mistake bound of $O\left(\frac{K}{\gamma^2} \right)$. 2. Under the more challenging weak linear separability condition, we design an efficient algorithm with a mistake bound of $2^{\widetilde{O}(\min(K \log^2 \frac{1}{\gamma}, \sqrt{\frac{1}{\gamma}} \log K))}$. Our algorithm is based on kernel Perceptron, which is inspired by the work of Klivans & Servedio (2008) on improperly learning intersection of halfspaces.}
}
Endnote
%0 Conference Paper
%T Bandit Multiclass Linear Classification: Efficient Algorithms for the Separable Case
%A Alina Beygelzimer
%A David Pal
%A Balazs Szorenyi
%A Devanathan Thiruvenkatachari
%A Chen-Yu Wei
%A Chicheng Zhang
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-beygelzimer19a
%I PMLR
%P 624--633
%U https://proceedings.mlr.press/v97/beygelzimer19a.html
%V 97
%X We study the problem of efficient online multiclass linear classification with bandit feedback, where all examples belong to one of $K$ classes and lie in the $d$-dimensional Euclidean space. Previous works have left open the challenge of designing efficient algorithms with finite mistake bounds when the data is linearly separable by a margin $\gamma$. In this work, we take a first step towards this problem. We consider two notions of linear separability: strong and weak. 1. Under the strong linear separability condition, we design an efficient algorithm that achieves a near-optimal mistake bound of $O\left(\frac{K}{\gamma^2} \right)$. 2. Under the more challenging weak linear separability condition, we design an efficient algorithm with a mistake bound of $2^{\widetilde{O}(\min(K \log^2 \frac{1}{\gamma}, \sqrt{\frac{1}{\gamma}} \log K))}$. Our algorithm is based on kernel Perceptron, which is inspired by the work of Klivans & Servedio (2008) on improperly learning intersection of halfspaces.
APA
Beygelzimer, A., Pal, D., Szorenyi, B., Thiruvenkatachari, D., Wei, C.-Y. & Zhang, C. (2019). Bandit Multiclass Linear Classification: Efficient Algorithms for the Separable Case. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:624-633. Available from https://proceedings.mlr.press/v97/beygelzimer19a.html.