Infinite Positive Semidefinite Tensor Factorization for Source Separation of Mixture Signals

Kazuyoshi Yoshii, Ryota Tomioka, Daichi Mochihashi, Masataka Goto
Proceedings of the 30th International Conference on Machine Learning, PMLR 28(3):576-584, 2013.

Abstract

This paper presents a new class of tensor factorization called positive semidefinite tensor factorization (PSDTF) that decomposes a set of positive semidefinite (PSD) matrices into the convex combinations of fewer PSD basis matrices. PSDTF can be viewed as a natural extension of nonnegative matrix factorization. One of the main problems of PSDTF is that an appropriate number of bases should be given in advance. To solve this problem, we propose a nonparametric Bayesian model based on a gamma process that can instantiate only a limited number of necessary bases from the infinitely many bases assumed to exist. We derive a variational Bayesian algorithm for closed-form posterior inference and a multiplicative update rule for maximum-likelihood estimation. We evaluated PSDTF on both synthetic data and real music recordings to show its superiority.
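For readers who want the factorization written out, the decomposition described in the abstract can be summarized by a single model equation. This is a minimal sketch inferred from the abstract alone; the symbols X_n, V_k, h_kn, and K are illustrative names rather than the paper's own notation.

    X_n \;\approx\; \sum_{k=1}^{K} h_{kn} V_k,
    \qquad V_k \succeq 0, \quad h_{kn} \ge 0,

where X_n is the n-th observed PSD matrix, the V_k are the PSD basis matrices, and the h_kn are nonnegative mixing weights. In the nonparametric model, K is effectively unbounded and the gamma-process prior shrinks the weights of unnecessary bases toward zero, so only a limited number of bases remain active. If all matrices are restricted to be diagonal, the expression reduces to ordinary NMF applied along the diagonals, which is one concrete sense in which PSDTF extends NMF.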

Cite this Paper


BibTeX
@InProceedings{pmlr-v28-yoshii13,
  title     = {Infinite Positive Semidefinite Tensor Factorization for Source Separation of Mixture Signals},
  author    = {Yoshii, Kazuyoshi and Tomioka, Ryota and Mochihashi, Daichi and Goto, Masataka},
  booktitle = {Proceedings of the 30th International Conference on Machine Learning},
  pages     = {576--584},
  year      = {2013},
  editor    = {Dasgupta, Sanjoy and McAllester, David},
  volume    = {28},
  number    = {3},
  series    = {Proceedings of Machine Learning Research},
  address   = {Atlanta, Georgia, USA},
  month     = {17--19 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v28/yoshii13.pdf},
  url       = {https://proceedings.mlr.press/v28/yoshii13.html}
}
APA
Yoshii, K., Tomioka, R., Mochihashi, D. & Goto, M. (2013). Infinite Positive Semidefinite Tensor Factorization for Source Separation of Mixture Signals. Proceedings of the 30th International Conference on Machine Learning, in Proceedings of Machine Learning Research 28(3):576-584. Available from https://proceedings.mlr.press/v28/yoshii13.html.