Fast Rank-1 NMF for Missing Data with KL Divergence

Kazu Ghalamkari, Mahito Sugiyama
Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, PMLR 151:2927-2940, 2022.

Abstract

We propose a fast non-gradient-based method of rank-1 non-negative matrix factorization (NMF) for missing data, called A1GM, that minimizes the KL divergence from an input matrix to the reconstructed rank-1 matrix. Our method is based on our new finding of an analytical closed-form solution to the best rank-1 non-negative multiple matrix factorization (NMMF), a variant of NMF. NMMF is known to exactly solve NMF for missing data if the positions of missing values satisfy a certain condition, and A1GM transforms a given matrix so that the analytical solution to NMMF can be applied. We empirically show that A1GM is more efficient than a gradient method with competitive reconstruction errors.
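For intuition, the classical closed form for the complete-data case (no missing values) is a minimal sketch of what the paper builds on: the best rank-1 approximation of a non-negative matrix under (generalized) KL divergence is the outer product of the row and column sums divided by the grand total. This is an illustrative sketch only, not the paper's A1GM algorithm for missing data; the function name is our own.

```python
import numpy as np

def best_rank1_kl(X):
    """Best rank-1 non-negative approximation of a complete matrix X
    under KL divergence: outer product of row sums and column sums,
    normalized by the grand total of X."""
    row = X.sum(axis=1)   # row sums
    col = X.sum(axis=0)   # column sums
    return np.outer(row, col) / X.sum()

X = np.array([[1.0, 2.0],
              [3.0, 4.0]])
R = best_rank1_kl(X)
# R has rank 1 and preserves the row and column sums of X.
```

Note that the reconstruction preserves the marginals of the input, which is the defining property of the KL-optimal rank-1 matrix.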

Cite this Paper


BibTeX
@InProceedings{pmlr-v151-ghalamkari22a,
  title     = {Fast Rank-1 NMF for Missing Data with KL Divergence},
  author    = {Ghalamkari, Kazu and Sugiyama, Mahito},
  booktitle = {Proceedings of The 25th International Conference on Artificial Intelligence and Statistics},
  pages     = {2927--2940},
  year      = {2022},
  editor    = {Camps-Valls, Gustau and Ruiz, Francisco J. R. and Valera, Isabel},
  volume    = {151},
  series    = {Proceedings of Machine Learning Research},
  month     = {28--30 Mar},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v151/ghalamkari22a/ghalamkari22a.pdf},
  url       = {https://proceedings.mlr.press/v151/ghalamkari22a.html},
  abstract  = {We propose a fast non-gradient-based method of rank-1 non-negative matrix factorization (NMF) for missing data, called A1GM, that minimizes the KL divergence from an input matrix to the reconstructed rank-1 matrix. Our method is based on our new finding of an analytical closed-formula of the best rank-1 non-negative multiple matrix factorization (NMMF), a variety of NMF. NMMF is known to exactly solve NMF for missing data if positions of missing values satisfy a certain condition, and A1GM transforms a given matrix so that the analytical solution to NMMF can be applied. We empirically show that A1GM is more efficient than a gradient method with competitive reconstruction errors.}
}
Endnote
%0 Conference Paper
%T Fast Rank-1 NMF for Missing Data with KL Divergence
%A Kazu Ghalamkari
%A Mahito Sugiyama
%B Proceedings of The 25th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2022
%E Gustau Camps-Valls
%E Francisco J. R. Ruiz
%E Isabel Valera
%F pmlr-v151-ghalamkari22a
%I PMLR
%P 2927--2940
%U https://proceedings.mlr.press/v151/ghalamkari22a.html
%V 151
%X We propose a fast non-gradient-based method of rank-1 non-negative matrix factorization (NMF) for missing data, called A1GM, that minimizes the KL divergence from an input matrix to the reconstructed rank-1 matrix. Our method is based on our new finding of an analytical closed-formula of the best rank-1 non-negative multiple matrix factorization (NMMF), a variety of NMF. NMMF is known to exactly solve NMF for missing data if positions of missing values satisfy a certain condition, and A1GM transforms a given matrix so that the analytical solution to NMMF can be applied. We empirically show that A1GM is more efficient than a gradient method with competitive reconstruction errors.
APA
Ghalamkari, K. & Sugiyama, M. (2022). Fast Rank-1 NMF for Missing Data with KL Divergence. Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 151:2927-2940. Available from https://proceedings.mlr.press/v151/ghalamkari22a.html.
