Fast Deterministic CUR Matrix Decomposition with Accuracy Assurance

Yasutoshi Ida, Sekitoshi Kanai, Yasuhiro Fujiwara, Tomoharu Iwata, Koh Takeuchi, Hisashi Kashima
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:4594-4603, 2020.

Abstract

The deterministic CUR matrix decomposition is a low-rank approximation method for analyzing a data matrix. It has attracted considerable attention due to its high interpretability, which stems from the fact that the decomposed matrices consist of subsets of the original columns and rows of the data matrix. These subsets are obtained by optimizing an objective function with sparsity-inducing norms via coordinate descent. However, existing optimization algorithms incur high computation costs because coordinate descent iteratively updates all the parameters in the objective until convergence. This paper proposes a fast deterministic CUR matrix decomposition. Our algorithm safely skips unnecessary updates by efficiently evaluating the optimality conditions for the parameters to be zero. In addition, we preferentially update the parameters that must be nonzero. Theoretically, our approach guarantees the same result as the original approach. Experiments demonstrate that our algorithm speeds up the deterministic CUR while achieving the same accuracy.
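
As a concrete illustration of the column-selection step the abstract refers to, the sketch below solves a group-lasso formulation commonly used for deterministic CUR, min_W ||A - AW||_F^2 + lambda * sum_j ||W_j||_2, with block coordinate descent, and zeros out a row of W whenever its optimality condition for being zero holds. This is a minimal sketch under assumptions of ours: the function name, the convergence check, and the lambda value are illustrative, and the paper's actual safe-skipping test and prioritized update ordering are not reproduced here.

```python
import numpy as np

def deterministic_cur_columns(A, lam, max_iter=100, tol=1e-6):
    """Select columns of A via group-lasso regression solved with block
    coordinate descent (a simplified stand-in for the deterministic CUR
    column-selection step; not the paper's accelerated algorithm)."""
    m, n = A.shape
    W = np.zeros((n, n))            # row j of W weights column A[:, j]
    R = A - A @ W                   # residual; equals A at the start
    col_sq = (A ** 2).sum(axis=0)   # ||A_j||_2^2 for each column
    for _ in range(max_iter):
        max_change = 0.0
        for j in range(n):
            if col_sq[j] == 0.0:
                continue
            w_old = W[j].copy()
            # partial residual with column j's contribution added back
            R_j = R + np.outer(A[:, j], w_old)
            g = A[:, j] @ R_j       # A_j^T R_j, shape (n,)
            g_norm = np.linalg.norm(g)
            if g_norm <= lam:
                # optimality condition for W[j] = 0 holds: set the row to zero
                W[j] = 0.0
            else:
                # closed-form block update for a single-column group
                W[j] = (1.0 - lam / g_norm) * g / col_sq[j]
            R = R_j - np.outer(A[:, j], W[j])
            max_change = max(max_change, np.abs(W[j] - w_old).max())
        if max_change < tol:
            break
    selected = np.flatnonzero(np.linalg.norm(W, axis=1) > 0)
    return selected, W

# Toy usage: select columns of a small random matrix.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
cols, W = deterministic_cur_columns(A, lam=40.0)  # lambda chosen by hand for this toy example
C = A[:, cols]   # the "C" factor of a CUR decomposition
```

The zero-check (||A_j^T R_j||_2 <= lambda) is the kind of optimality condition the paper exploits: when it holds, the corresponding parameters stay zero and their updates can be skipped; the paper's contribution is evaluating such conditions efficiently and safely so the result matches the original coordinate descent.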

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-ida20a,
  title     = {Fast Deterministic {CUR} Matrix Decomposition with Accuracy Assurance},
  author    = {Ida, Yasutoshi and Kanai, Sekitoshi and Fujiwara, Yasuhiro and Iwata, Tomoharu and Takeuchi, Koh and Kashima, Hisashi},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {4594--4603},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/ida20a/ida20a.pdf},
  url       = {https://proceedings.mlr.press/v119/ida20a.html},
  abstract  = {The deterministic CUR matrix decomposition is a low-rank approximation method to analyze a data matrix. It has attracted considerable attention due to its high interpretability, which results from the fact that the decomposed matrices consist of subsets of the original columns and rows of the data matrix. The subset is obtained by optimizing an objective function with sparsity-inducing norms via coordinate descent. However, the existing algorithms for optimization incur high computation costs. This is because coordinate descent iteratively updates all the parameters in the objective until convergence. This paper proposes a fast deterministic CUR matrix decomposition. Our algorithm safely skips unnecessary updates by efficiently evaluating the optimality conditions for the parameters to be zeros. In addition, we preferentially update the parameters that must be nonzeros. Theoretically, our approach guarantees the same result as the original approach. Experiments demonstrate that our algorithm speeds up the deterministic CUR while achieving the same accuracy.}
}
Endnote
%0 Conference Paper
%T Fast Deterministic CUR Matrix Decomposition with Accuracy Assurance
%A Yasutoshi Ida
%A Sekitoshi Kanai
%A Yasuhiro Fujiwara
%A Tomoharu Iwata
%A Koh Takeuchi
%A Hisashi Kashima
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-ida20a
%I PMLR
%P 4594--4603
%U https://proceedings.mlr.press/v119/ida20a.html
%V 119
%X The deterministic CUR matrix decomposition is a low-rank approximation method to analyze a data matrix. It has attracted considerable attention due to its high interpretability, which results from the fact that the decomposed matrices consist of subsets of the original columns and rows of the data matrix. The subset is obtained by optimizing an objective function with sparsity-inducing norms via coordinate descent. However, the existing algorithms for optimization incur high computation costs. This is because coordinate descent iteratively updates all the parameters in the objective until convergence. This paper proposes a fast deterministic CUR matrix decomposition. Our algorithm safely skips unnecessary updates by efficiently evaluating the optimality conditions for the parameters to be zeros. In addition, we preferentially update the parameters that must be nonzeros. Theoretically, our approach guarantees the same result as the original approach. Experiments demonstrate that our algorithm speeds up the deterministic CUR while achieving the same accuracy.
APA
Ida, Y., Kanai, S., Fujiwara, Y., Iwata, T., Takeuchi, K., & Kashima, H. (2020). Fast Deterministic CUR Matrix Decomposition with Accuracy Assurance. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:4594-4603. Available from https://proceedings.mlr.press/v119/ida20a.html.