Scalable Model Selection for Large-Scale Factorial Relational Models

Chunchen Liu, Lu Feng, Ryohei Fujimaki, Yusuke Muraoka
Proceedings of the 32nd International Conference on Machine Learning, PMLR 37:1227-1235, 2015.

Abstract

With a growing need to understand large-scale networks, factorial relational models, such as binary matrix factorization models (BMFs), have become important in many applications. Although BMFs have a natural capability to uncover overlapping group structures behind network data, existing inference techniques have issues of either high computational cost or lack of model selection capability, and this limits their applicability. For scalable model selection of BMFs, this paper proposes stochastic factorized asymptotic Bayesian (sFAB) inference that combines concepts in two recently-developed techniques: stochastic variational inference (SVI) and FAB inference. sFAB is a highly-efficient algorithm, having both scalability and an inherent model selection capability in a single inference framework. Empirical results show the superiority of sFAB/BMF in both accuracy and scalability over state-of-the-art inference methods for overlapping relational models.
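The following is a minimal, illustrative NumPy sketch of the binary matrix factorization (BMF) model class the abstract refers to; it is not the authors' implementation. It shows how overlapping binary group memberships for row and column entities can generate a binary relational (network) matrix, and it notes in comments where an SVI-style minibatch step, as used by sFAB, would come in. The logistic link, the group-interaction matrix Theta, and all names and sizes are assumptions for illustration only; the exact sFAB updates and their model-selection (shrinkage) terms are given in the paper itself.

```python
# Illustrative sketch (not the paper's code) of a binary matrix factorization
# model for relational data with overlapping groups.
import numpy as np

rng = np.random.default_rng(0)

N, M, K = 100, 80, 4           # row entities, column entities, latent groups

# Overlapping binary group memberships for rows (Z) and columns (W).
Z = rng.random((N, K)) < 0.3   # N x K binary membership matrix
W = rng.random((M, K)) < 0.3   # M x K binary membership matrix

# Group-group interaction weights plus a background bias (assumed forms).
Theta = rng.normal(0.0, 2.0, size=(K, K))
bias = -2.0

# Bernoulli link (illustrative): P(X_ij = 1) = sigmoid(z_i^T Theta w_j + bias).
logits = Z.astype(float) @ Theta @ W.astype(float).T + bias
prob = 1.0 / (1.0 + np.exp(-logits))
X = (rng.random((N, M)) < prob).astype(int)   # observed binary relational matrix

print("observed link density:", X.mean())

# SVI-style inference, as combined with FAB in sFAB, would repeatedly subsample
# a minibatch of rows/entries of X and apply noisy-gradient updates to the
# variational parameters of Z and W; this subsampling is what gives scalability.
rows = rng.choice(N, size=10, replace=False)  # an example minibatch of rows
X_batch = X[rows]
```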

Cite this Paper


BibTeX
@InProceedings{pmlr-v37-liub15,
  title     = {Scalable Model Selection for Large-Scale Factorial Relational Models},
  author    = {Liu, Chunchen and Feng, Lu and Fujimaki, Ryohei and Muraoka, Yusuke},
  booktitle = {Proceedings of the 32nd International Conference on Machine Learning},
  pages     = {1227--1235},
  year      = {2015},
  editor    = {Bach, Francis and Blei, David},
  volume    = {37},
  series    = {Proceedings of Machine Learning Research},
  address   = {Lille, France},
  month     = {07--09 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v37/liub15.pdf},
  url       = {https://proceedings.mlr.press/v37/liub15.html}
}
Endnote
%0 Conference Paper
%T Scalable Model Selection for Large-Scale Factorial Relational Models
%A Chunchen Liu
%A Lu Feng
%A Ryohei Fujimaki
%A Yusuke Muraoka
%B Proceedings of the 32nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2015
%E Francis Bach
%E David Blei
%F pmlr-v37-liub15
%I PMLR
%P 1227--1235
%U https://proceedings.mlr.press/v37/liub15.html
%V 37
RIS
TY - CPAPER
TI - Scalable Model Selection for Large-Scale Factorial Relational Models
AU - Chunchen Liu
AU - Lu Feng
AU - Ryohei Fujimaki
AU - Yusuke Muraoka
BT - Proceedings of the 32nd International Conference on Machine Learning
DA - 2015/06/01
ED - Francis Bach
ED - David Blei
ID - pmlr-v37-liub15
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 37
SP - 1227
EP - 1235
L1 - http://proceedings.mlr.press/v37/liub15.pdf
UR - https://proceedings.mlr.press/v37/liub15.html
ER -
APA
Liu, C., Feng, L., Fujimaki, R. & Muraoka, Y. (2015). Scalable Model Selection for Large-Scale Factorial Relational Models. Proceedings of the 32nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 37:1227-1235. Available from https://proceedings.mlr.press/v37/liub15.html.

Related Material

Download PDF: http://proceedings.mlr.press/v37/liub15.pdf