Sublinear quantum algorithms for training linear and kernel-based classifiers

Tongyang Li, Shouvanik Chakrabarti, Xiaodi Wu
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:3815-3824, 2019.

Abstract

We investigate quantum algorithms for classification, a fundamental problem in machine learning, with provable guarantees. Given $n$ $d$-dimensional data points, the state-of-the-art (and optimal) classical algorithm for training classifiers with constant margin by Clarkson et al. runs in $\tilde{O}(n+d)$ time, which is also optimal in its input/output model. We design sublinear quantum algorithms for the same task running in $\tilde{O}(\sqrt{n}+\sqrt{d})$ time, a quadratic improvement in both $n$ and $d$. Moreover, our algorithms use the standard quantization of the classical input and generate the same classical output, suggesting minimal overheads when used as subroutines for end-to-end applications. We also demonstrate a tight lower bound (up to poly-log factors) and discuss the possibility of implementation on near-term quantum machines.
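As context for the complexity claims above: the classical $\tilde{O}(n+d)$ baseline of Clarkson et al. is, roughly speaking, a primal-dual saddle-point solver that plays multiplicative-weights updates over the $n$ training examples against online gradient steps on the $d$-dimensional hyperplane, with sampling-based estimators standing in for exact inner products; the quantum algorithms of this paper speed up those sampling and estimation steps. The Python sketch below is only an illustration of that classical primal-dual loop under simplifying assumptions: it computes inner products exactly (so each iteration costs $O(nd)$ rather than being sublinear), and the function name, step size, and iteration count are invented for the example. It is not the quantum algorithm from the paper.

import numpy as np

def train_margin_classifier(X, y, T=500, eta=0.1):
    """Illustrative primal-dual solver for max_w min_i y_i <x_i, w>.

    Multiplicative weights maintain a distribution p over the n examples
    (small-margin, i.e. "hard", examples gain weight); online gradient
    ascent updates the hyperplane w on the unit l2 ball. Clarkson et al.
    make each iteration sublinear via sampling-based inner-product
    estimates; this sketch computes the inner products exactly for
    clarity.
    """
    n, d = X.shape
    p = np.full(n, 1.0 / n)   # dual variable: distribution over examples
    w = np.zeros(d)           # primal variable: candidate hyperplane
    w_avg = np.zeros(d)
    for _ in range(T):
        margins = y * (X @ w)            # y_i <x_i, w> for every example
        # Dual update: up-weight the examples with the smallest margins.
        p *= np.exp(-eta * margins)
        p /= p.sum()
        # Primal update: gradient step toward the p-weighted example,
        # then project back onto the unit l2 ball.
        g = X.T @ (p * y)
        w += eta * g
        norm = np.linalg.norm(w)
        if norm > 1.0:
            w /= norm
        w_avg += w
    return w_avg / T           # averaged iterate, as in saddle-point analyses

# Toy usage: two separable Gaussian blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(+1.0, 0.3, (50, 5)),
               rng.normal(-1.0, 0.3, (50, 5))])
y = np.hstack([np.ones(50), -np.ones(50)])
w = train_margin_classifier(X, y)
print("training accuracy:", np.mean(np.sign(X @ w) == y))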

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-li19b,
  title     = {Sublinear quantum algorithms for training linear and kernel-based classifiers},
  author    = {Li, Tongyang and Chakrabarti, Shouvanik and Wu, Xiaodi},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {3815--3824},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/li19b/li19b.pdf},
  url       = {https://proceedings.mlr.press/v97/li19b.html},
  abstract  = {We investigate quantum algorithms for classification, a fundamental problem in machine learning, with provable guarantees. Given $n$ $d$-dimensional data points, the state-of-the-art (and optimal) classical algorithm for training classifiers with constant margin by Clarkson et al. runs in $\tilde{O}(n+d)$ time, which is also optimal in its input/output model. We design sublinear quantum algorithms for the same task running in $\tilde{O}(\sqrt{n}+\sqrt{d})$ time, a quadratic improvement in both $n$ and $d$. Moreover, our algorithms use the standard quantization of the classical input and generate the same classical output, suggesting minimal overheads when used as subroutines for end-to-end applications. We also demonstrate a tight lower bound (up to poly-log factors) and discuss the possibility of implementation on near-term quantum machines.}
}
Endnote
%0 Conference Paper
%T Sublinear quantum algorithms for training linear and kernel-based classifiers
%A Tongyang Li
%A Shouvanik Chakrabarti
%A Xiaodi Wu
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-li19b
%I PMLR
%P 3815--3824
%U https://proceedings.mlr.press/v97/li19b.html
%V 97
%X We investigate quantum algorithms for classification, a fundamental problem in machine learning, with provable guarantees. Given $n$ $d$-dimensional data points, the state-of-the-art (and optimal) classical algorithm for training classifiers with constant margin by Clarkson et al. runs in $\tilde{O}(n+d)$ time, which is also optimal in its input/output model. We design sublinear quantum algorithms for the same task running in $\tilde{O}(\sqrt{n}+\sqrt{d})$ time, a quadratic improvement in both $n$ and $d$. Moreover, our algorithms use the standard quantization of the classical input and generate the same classical output, suggesting minimal overheads when used as subroutines for end-to-end applications. We also demonstrate a tight lower bound (up to poly-log factors) and discuss the possibility of implementation on near-term quantum machines.
APA
Li, T., Chakrabarti, S. & Wu, X. (2019). Sublinear quantum algorithms for training linear and kernel-based classifiers. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:3815-3824. Available from https://proceedings.mlr.press/v97/li19b.html.