Sublinear quantum algorithms for training linear and kernel-based classifiers
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:3815-3824, 2019.
Abstract
We investigate quantum algorithms with provable guarantees for classification, a fundamental problem in machine learning. Given $n$ $d$-dimensional data points, the state-of-the-art classical algorithm for training classifiers with constant margin, by Clarkson et al., runs in time $\tilde{O}(n+d)$, which is optimal in its input/output model. We design sublinear quantum algorithms for the same task running in time $\tilde{O}(\sqrt{n}+\sqrt{d})$, a quadratic improvement in both $n$ and $d$. Moreover, our algorithms use the standard quantization of the classical input and generate the same classical output, suggesting minimal overheads when used as subroutines for end-to-end applications. We also demonstrate a tight lower bound (up to polylog factors) and discuss the possibility of implementation on near-term quantum machines.