Quantum Expectation-Maximization for Gaussian mixture models

Iordanis Kerenidis, Alessandro Luongo, Anupam Prakash
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:5187-5197, 2020.

Abstract

We define a quantum version of Expectation-Maximization (QEM), a fundamental tool in unsupervised machine learning, often used to solve Maximum Likelihood (ML) and Maximum A Posteriori (MAP) estimation problems. We use QEM to fit a Gaussian Mixture Model, and show how to generalize it to fit mixture models with base distributions in the exponential family. Given quantum access to a dataset, our algorithm has convergence and precision guarantees similar to the classical algorithm, while the runtime is polylogarithmic in the number of elements in the training set and polynomial in other parameters, such as the dimension of the feature space and the number of components in the mixture. We discuss the performance of the algorithm on a dataset that is expected to be classified successfully by classical EM and provide guarantees for its runtime.
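The quantum algorithm follows the same E-step/M-step structure as classical EM for Gaussian mixtures, replacing the classical subroutines with quantum ones. As a point of reference, here is a minimal classical EM loop for a 1-D mixture of two Gaussians; the toy data and variable names are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
# Synthetic data: half the points from N(-3, 1), half from N(3, 1)
x = np.concatenate([rng.normal(-3.0, 1.0, n // 2),
                    rng.normal(3.0, 1.0, n // 2)])

# Initial parameters: mixing weights, means, variances for k = 2 components
pi = np.array([0.5, 0.5])
mu = np.array([-1.0, 1.0])
var = np.array([1.0, 1.0])

def gaussian_pdf(x, mu, var):
    return np.exp(-(x - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

for _ in range(50):
    # E-step: responsibilities r[i, j] = P(component j | x_i)
    dens = pi * gaussian_pdf(x[:, None], mu, var)   # shape (n, 2)
    r = dens / dens.sum(axis=1, keepdims=True)

    # M-step: re-estimate weights, means, variances from responsibilities
    nk = r.sum(axis=0)
    pi = nk / n
    mu = (r * x[:, None]).sum(axis=0) / nk
    var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk

mu_sorted = np.sort(mu)
print(mu_sorted)  # recovered means, close to (-3, 3)
```

Each E-step and M-step above touches every data point, which is where the classical cost scales linearly in the training-set size; the paper's contribution is performing the analogous estimation steps with runtime polylogarithmic in that size, given quantum access to the data.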

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-kerenidis20a,
  title     = {Quantum Expectation-Maximization for {G}aussian mixture models},
  author    = {Kerenidis, Iordanis and Luongo, Alessandro and Prakash, Anupam},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {5187--5197},
  year      = {2020},
  editor    = {Hal Daumé III and Aarti Singh},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/kerenidis20a/kerenidis20a.pdf},
  url       = {http://proceedings.mlr.press/v119/kerenidis20a.html},
  abstract  = {We define a quantum version of Expectation-Maximization (QEM), a fundamental tool in unsupervised machine learning, often used to solve Maximum Likelihood (ML) and Maximum A Posteriori (MAP) estimation problems. We use QEM to fit a Gaussian Mixture Model, and show how to generalize it to fit mixture models with base distributions in the exponential family. Given quantum access to a dataset, our algorithm has convergence and precision guarantees similar to the classical algorithm, while the runtime is polylogarithmic in the number of elements in the training set and polynomial in other parameters, such as the dimension of the feature space and the number of components in the mixture. We discuss the performance of the algorithm on a dataset that is expected to be classified successfully by classical EM and provide guarantees for its runtime.}
}
Endnote
%0 Conference Paper
%T Quantum Expectation-Maximization for Gaussian mixture models
%A Iordanis Kerenidis
%A Alessandro Luongo
%A Anupam Prakash
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-kerenidis20a
%I PMLR
%P 5187--5197
%U http://proceedings.mlr.press/v119/kerenidis20a.html
%V 119
%X We define a quantum version of Expectation-Maximization (QEM), a fundamental tool in unsupervised machine learning, often used to solve Maximum Likelihood (ML) and Maximum A Posteriori (MAP) estimation problems. We use QEM to fit a Gaussian Mixture Model, and show how to generalize it to fit mixture models with base distributions in the exponential family. Given quantum access to a dataset, our algorithm has convergence and precision guarantees similar to the classical algorithm, while the runtime is polylogarithmic in the number of elements in the training set and polynomial in other parameters, such as the dimension of the feature space and the number of components in the mixture. We discuss the performance of the algorithm on a dataset that is expected to be classified successfully by classical EM and provide guarantees for its runtime.
APA
Kerenidis, I., Luongo, A., & Prakash, A. (2020). Quantum Expectation-Maximization for Gaussian mixture models. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:5187-5197. Available from http://proceedings.mlr.press/v119/kerenidis20a.html.