Fast Variational Bayesian Inference for Non-Conjugate Matrix Factorization Models

Matthias Seeger, Guillaume Bouchard
Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics, PMLR 22:1012-1018, 2012.

Abstract

Probabilistic matrix factorization methods aim to extract meaningful correlation structure from an incomplete data matrix by postulating low rank constraints. Recently, variational Bayesian (VB) inference techniques have successfully been applied to such large scale bilinear models. However, current algorithms are of the alternate updating or stochastic gradient descent type, slow to converge and prone to getting stuck in shallow local minima. While for MAP or maximum margin estimation, singular value shrinkage algorithms have been proposed which can far outperform alternate updating, this methodological avenue remains unexplored for Bayesian techniques. In this paper, we show how to combine a recent singular value shrinkage characterization of fully observed spherical Gaussian VB matrix factorization with augmented Lagrangian techniques in order to obtain efficient VB inference for general MF models with arbitrary likelihood potentials. In particular, we show how to handle Poisson and Bernoulli potentials, far more suited for most MF applications than Gaussian likelihoods. Our algorithm can be run even for very large models and is easily implemented in Matlab. It outperforms MAP estimation on a range of real-world datasets.
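To give a rough sense of the splitting structure the abstract describes, the minimal Python/NumPy sketch below alternates a separable likelihood step on the observed entries with a closed-form singular value shrinkage step on a fully observed surrogate matrix, coupled by an augmented-Lagrangian (ADMM-style) dual update. It is an illustration only, not the authors' algorithm: the VB-specific shrinkage rule derived in the paper is replaced here by plain soft-thresholding of singular values (the MAP / nuclear-norm rule), and the likelihood step is specialized to a Bernoulli (logistic) model; all names and parameters (sv_shrink, bernoulli_prox, binary_mf_admm, tau, rho) are hypothetical.

# Illustrative sketch only, not the authors' algorithm: augmented-Lagrangian
# splitting for binary matrix factorization. The VB shrinkage step from the
# paper is replaced by plain singular value soft-thresholding.
import numpy as np

def sv_shrink(A, tau):
    """Soft-threshold the singular values of A by tau (closed-form step)."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def bernoulli_prox(z, y, rho, n_newton=20):
    """Elementwise prox of the logistic loss for labels y in {-1,+1}:
    argmin_x log(1 + exp(-y*x)) + (rho/2)*(x - z)^2, via Newton steps."""
    x = z.copy()
    t = (y + 1.0) / 2.0                      # labels mapped to {0,1}
    for _ in range(n_newton):
        sig = 1.0 / (1.0 + np.exp(-x))
        grad = (sig - t) + rho * (x - z)
        hess = sig * (1.0 - sig) + rho       # strictly positive, Newton is safe
        x -= grad / hess
    return x

def binary_mf_admm(Y, mask, tau=1.0, rho=1.0, n_iter=100):
    """Y: +/-1 matrix observed where mask is True. Returns a low-rank fit Z."""
    Z = np.zeros(Y.shape)        # low-rank variable (shrinkage step)
    L = np.zeros(Y.shape)        # scaled dual variable
    for _ in range(n_iter):
        # Likelihood step: separable prox on observed entries, identity elsewhere.
        X = Z - L
        X[mask] = bernoulli_prox((Z - L)[mask], Y[mask].astype(float), rho)
        # Shrinkage step: fully observed spectral problem, solved in closed form.
        Z = sv_shrink(X + L, tau / rho)
        # Dual update keeps the two copies consistent.
        L += X - Z
    return Z

In the paper's actual method, the spectral step comes from the singular value shrinkage characterization of fully observed spherical Gaussian VB matrix factorization rather than nuclear-norm thresholding, and the separable step handles general non-conjugate potentials such as Poisson; the sketch only conveys why each half-step stays cheap even for very large models.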

Cite this Paper


BibTeX
@InProceedings{pmlr-v22-seeger12, title = {Fast Variational Bayesian Inference for Non-Conjugate Matrix Factorization Models}, author = {Seeger, Matthias and Bouchard, Guillaume}, booktitle = {Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics}, pages = {1012--1018}, year = {2012}, editor = {Lawrence, Neil D. and Girolami, Mark}, volume = {22}, series = {Proceedings of Machine Learning Research}, address = {La Palma, Canary Islands}, month = {21--23 Apr}, publisher = {PMLR}, pdf = {http://proceedings.mlr.press/v22/seeger12/seeger12.pdf}, url = {https://proceedings.mlr.press/v22/seeger12.html}, abstract = {Probabilistic matrix factorization methods aim to extract meaningful correlation structure from an incomplete data matrix by postulating low rank constraints. Recently, variational Bayesian (VB) inference techniques have successfully been applied to such large scale bilinear models. However, current algorithms are of the alternate updating or stochastic gradient descent type, slow to converge and prone to getting stuck in shallow local minima. While for MAP or maximum margin estimation, singular value shrinkage algorithms have been proposed which can far outperform alternate updating, this methodological avenue remains unexplored for Bayesian techniques. In this paper, we show how to combine a recent singular value shrinkage characterization of fully observed spherical Gaussian VB matrix factorization with augmented Lagrangian techniques in order to obtain efficient VB inference for general MF models with arbitrary likelihood potentials. In particular, we show how to handle Poisson and Bernoulli potentials, far more suited for most MF applications than Gaussian likelihoods. Our algorithm can be run even for very large models and is easily implemented in \em Matlab. It outperforms MAP estimation on a range of real-world datasets.} }
Endnote
%0 Conference Paper %T Fast Variational Bayesian Inference for Non-Conjugate Matrix Factorization Models %A Matthias Seeger %A Guillaume Bouchard %B Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics %C Proceedings of Machine Learning Research %D 2012 %E Neil D. Lawrence %E Mark Girolami %F pmlr-v22-seeger12 %I PMLR %P 1012--1018 %U https://proceedings.mlr.press/v22/seeger12.html %V 22 %X Probabilistic matrix factorization methods aim to extract meaningful correlation structure from an incomplete data matrix by postulating low rank constraints. Recently, variational Bayesian (VB) inference techniques have successfully been applied to such large scale bilinear models. However, current algorithms are of the alternate updating or stochastic gradient descent type, slow to converge and prone to getting stuck in shallow local minima. While for MAP or maximum margin estimation, singular value shrinkage algorithms have been proposed which can far outperform alternate updating, this methodological avenue remains unexplored for Bayesian techniques. In this paper, we show how to combine a recent singular value shrinkage characterization of fully observed spherical Gaussian VB matrix factorization with augmented Lagrangian techniques in order to obtain efficient VB inference for general MF models with arbitrary likelihood potentials. In particular, we show how to handle Poisson and Bernoulli potentials, far more suited for most MF applications than Gaussian likelihoods. Our algorithm can be run even for very large models and is easily implemented in \em Matlab. It outperforms MAP estimation on a range of real-world datasets.
RIS
TY - CPAPER TI - Fast Variational Bayesian Inference for Non-Conjugate Matrix Factorization Models AU - Matthias Seeger AU - Guillaume Bouchard BT - Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics DA - 2012/03/21 ED - Neil D. Lawrence ED - Mark Girolami ID - pmlr-v22-seeger12 PB - PMLR DP - Proceedings of Machine Learning Research VL - 22 SP - 1012 EP - 1018 L1 - http://proceedings.mlr.press/v22/seeger12/seeger12.pdf UR - https://proceedings.mlr.press/v22/seeger12.html AB - Probabilistic matrix factorization methods aim to extract meaningful correlation structure from an incomplete data matrix by postulating low rank constraints. Recently, variational Bayesian (VB) inference techniques have successfully been applied to such large scale bilinear models. However, current algorithms are of the alternate updating or stochastic gradient descent type, slow to converge and prone to getting stuck in shallow local minima. While for MAP or maximum margin estimation, singular value shrinkage algorithms have been proposed which can far outperform alternate updating, this methodological avenue remains unexplored for Bayesian techniques. In this paper, we show how to combine a recent singular value shrinkage characterization of fully observed spherical Gaussian VB matrix factorization with augmented Lagrangian techniques in order to obtain efficient VB inference for general MF models with arbitrary likelihood potentials. In particular, we show how to handle Poisson and Bernoulli potentials, far more suited for most MF applications than Gaussian likelihoods. Our algorithm can be run even for very large models and is easily implemented in \em Matlab. It outperforms MAP estimation on a range of real-world datasets. ER -
APA
Seeger, M. & Bouchard, G. (2012). Fast Variational Bayesian Inference for Non-Conjugate Matrix Factorization Models. Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 22:1012-1018. Available from https://proceedings.mlr.press/v22/seeger12.html.
