Partial Trace Regression and Low-Rank Kraus Decomposition

Hachem Kadri, Stephane Ayache, Riikka Huusari, Alain Rakotomamonjy, Liva Ralaivola
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:5031-5041, 2020.

Abstract

The trace regression model, a direct extension of the well-studied linear regression model, allows one to map matrices to real-valued outputs. We here introduce an even more general model, namely the partial-trace regression model, a family of linear mappings from matrix-valued inputs to matrix-valued outputs; this model subsumes the trace regression model and thus the linear regression model. Borrowing tools from quantum information theory, where partial trace operators have been extensively studied, we propose a framework for learning partial trace regression models from data by taking advantage of the so-called low-rank Kraus representation of completely positive maps. We show the relevance of our framework with synthetic and real-world experiments conducted for both i) matrix-to-matrix regression and ii) positive semidefinite matrix completion, two tasks which can be formulated as partial trace regression problems.
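To make the model concrete: a completely positive map with a rank-r Kraus representation acts on a matrix X as Phi(X) = sum_{j=1}^r A_j X A_j^T (in the real case), so learning a partial trace regression model amounts to fitting the Kraus factors A_j from input/output matrix pairs. The following is a minimal illustrative sketch, not the authors' implementation: the dimensions p and q, the Kraus rank r, the helper cp_map, and the plain gradient-descent fit of the squared Frobenius loss are all assumptions chosen for the example.

# Minimal sketch (assumed setup, not the paper's code): fit a partial trace
# regression map Phi(X) = sum_j A_j X A_j^T with r Kraus factors by minimizing
# the squared Frobenius loss over (X_i, Y_i) pairs using autograd.
import torch

torch.manual_seed(0)
p, q, r, n = 8, 4, 3, 200      # input size, output size, Kraus rank, number of samples

# Ground-truth Kraus factors, used here only to simulate data.
A_true = torch.randn(r, q, p)

def cp_map(A, X):
    # For each sample i and factor j, compute A_j X_i A_j^T, then sum over j.
    # A: (r, q, p), X: (n, p, p) -> output: (n, q, q)
    return torch.einsum('jqp,npk,jsk->nqs', A, X, A)

# Simulate PSD inputs X_i = B_i B_i^T and noisy outputs Y_i = Phi_true(X_i) + noise.
B = torch.randn(n, p, p)
X = B @ B.transpose(1, 2) / p
Y = cp_map(A_true, X) + 0.01 * torch.randn(n, q, q)

# Learn the Kraus factors by gradient descent on the empirical squared loss.
A = torch.randn(r, q, p, requires_grad=True)
opt = torch.optim.Adam([A], lr=1e-2)
for step in range(2000):
    opt.zero_grad()
    loss = ((cp_map(A, X) - Y) ** 2).mean()
    loss.backward()
    opt.step()

print(f"final training loss: {loss.item():.6f}")

Because Phi is a sum of congruence transformations, its outputs are positive semidefinite whenever its inputs are; the same parameterization is what makes the model applicable to positive semidefinite matrix completion as well as matrix-to-matrix regression.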

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-kadri20a,
  title     = {Partial Trace Regression and Low-Rank Kraus Decomposition},
  author    = {Kadri, Hachem and Ayache, Stephane and Huusari, Riikka and Rakotomamonjy, Alain and Ralaivola, Liva},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {5031--5041},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/kadri20a/kadri20a.pdf},
  url       = {https://proceedings.mlr.press/v119/kadri20a.html},
  abstract  = {The trace regression model, a direct extension of the well-studied linear regression model, allows one to map matrices to real-valued outputs. We here introduce an even more general model, namely the partial-trace regression model, a family of linear mappings from matrix-valued inputs to matrix-valued outputs; this model subsumes the trace regression model and thus the linear regression model. Borrowing tools from quantum information theory, where partial trace operators have been extensively studied, we propose a framework for learning partial trace regression models from data by taking advantage of the so-called low-rank Kraus representation of completely positive maps. We show the relevance of our framework with synthetic and real-world experiments conducted for both i) matrix-to-matrix regression and ii) positive semidefinite matrix completion, two tasks which can be formulated as partial trace regression problems.}
}
Endnote
%0 Conference Paper
%T Partial Trace Regression and Low-Rank Kraus Decomposition
%A Hachem Kadri
%A Stephane Ayache
%A Riikka Huusari
%A Alain Rakotomamonjy
%A Liva Ralaivola
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-kadri20a
%I PMLR
%P 5031--5041
%U https://proceedings.mlr.press/v119/kadri20a.html
%V 119
%X The trace regression model, a direct extension of the well-studied linear regression model, allows one to map matrices to real-valued outputs. We here introduce an even more general model, namely the partial-trace regression model, a family of linear mappings from matrix-valued inputs to matrix-valued outputs; this model subsumes the trace regression model and thus the linear regression model. Borrowing tools from quantum information theory, where partial trace operators have been extensively studied, we propose a framework for learning partial trace regression models from data by taking advantage of the so-called low-rank Kraus representation of completely positive maps. We show the relevance of our framework with synthetic and real-world experiments conducted for both i) matrix-to-matrix regression and ii) positive semidefinite matrix completion, two tasks which can be formulated as partial trace regression problems.
APA
Kadri, H., Ayache, S., Huusari, R., Rakotomamonjy, A. & Ralaivola, L. (2020). Partial Trace Regression and Low-Rank Kraus Decomposition. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:5031-5041. Available from https://proceedings.mlr.press/v119/kadri20a.html.