Online Nonnegative Matrix Factorization with General Divergences
Proceedings of the 20th International Conference on Artificial Intelligence and Statistics, PMLR 54:37-45, 2017.
We develop a unified and systematic framework for performing online nonnegative matrix factorization under a wide variety of important divergences. The online nature of our algorithms makes them particularly amenable to large-scale data. We prove that the sequence of learned dictionaries converges almost surely to the set of critical points of the expected loss function. Experimental results demonstrate the computational efficiency and strong performance of our algorithms on several real-life applications, including topic modeling, document clustering, and foreground-background separation.
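To illustrate the general idea of online NMF described above, here is a minimal, hypothetical sketch (not the paper's exact algorithm): each streamed sample is assigned a nonnegative code against the current dictionary, and the dictionary is then updated with a projected gradient step. The squared (Frobenius) loss is used for simplicity, whereas the paper treats a much broader family of divergences; all function names and parameters below are illustrative assumptions.

```python
import numpy as np

def fit_code(W, x, n_iter=50, eps=1e-10):
    """Nonnegative code h minimizing ||x - W h||^2 via multiplicative updates."""
    k = W.shape[1]
    h = np.full(k, 1.0 / k)
    WtW = W.T @ W
    Wtx = W.T @ x
    for _ in range(n_iter):
        # Standard Lee-Seung multiplicative update; eps guards division by zero.
        h *= Wtx / (WtW @ h + eps)
    return h

def online_nmf(samples, k, lr=0.1, seed=0):
    """Stream nonnegative sample vectors and learn a d-by-k dictionary W."""
    rng = np.random.default_rng(seed)
    d = samples[0].shape[0]
    W = rng.random((d, k)) + 0.1  # random nonnegative initialization
    for x in samples:
        h = fit_code(W, x)
        # Gradient of 0.5 * ||x - W h||^2 with respect to W.
        grad = np.outer(W @ h - x, h)
        # Projected gradient step keeps the dictionary nonnegative.
        W = np.maximum(W - lr * grad, 0.0)
    return W
```

A usage sketch: generate samples from a random nonnegative model, e.g. `samples = list((true_W @ H).T)`, then call `online_nmf(samples, k=3)`. Swapping the squared loss for another divergence would change both the code-fitting step and the dictionary gradient accordingly.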