Approximate Inference Using DC Programming For Collective Graphical Models

Thien Nguyen, Akshat Kumar, Hoong Chuin Lau, Daniel Sheldon
Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, PMLR 51:685-693, 2016.

Abstract

Collective graphical models (CGMs) provide a framework for reasoning about a population of independent and identically distributed individuals when only noisy, aggregate observations are given. Previous approaches to inference in CGMs operate on a junction-tree representation, severely limiting their scalability. To remedy this, we show how the Bethe entropy approximation naturally arises for the inference problem in CGMs. We reformulate the resulting optimization problem as a difference-of-convex functions program that can capture different types of CGM noise models. Using the concave-convex procedure, we then develop a scalable message-passing algorithm. Empirically, our approach is highly scalable and accurate on large graphs, more than an order of magnitude faster than a generic optimization solver, and is guaranteed to converge, unlike the previous message-passing approach NLBP, which fails to converge on several loopy graphs.
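To illustrate the optimization machinery the abstract refers to, here is a minimal, generic sketch of the concave-convex procedure (CCP) on a toy one-dimensional difference-of-convex problem. This is not the paper's message-passing algorithm for CGMs; the functions `g`, `h`, and the closed-form subproblem solver below are illustrative assumptions chosen so each step is solvable by hand.

```python
import numpy as np

def ccp(grad_h, x0, solve_convex, iters=50, tol=1e-8):
    """Generic concave-convex procedure: minimize f(x) = g(x) - h(x),
    with g and h both convex, by repeatedly linearizing h at the current
    iterate and solving the resulting convex subproblem.

    solve_convex(c) must return argmin_x g(x) - c * x, i.e. the convex
    subproblem obtained after linearizing h.
    """
    x = x0
    for _ in range(iters):
        x_new = solve_convex(grad_h(x))
        if abs(x_new - x) < tol:   # stop when iterates stabilize
            return x_new
        x = x_new
    return x

# Toy DC program (an assumption for illustration):
#   g(x) = x^4 and h(x) = 2x^2 are both convex,
#   so f(x) = x^4 - 2x^2 is a DC function with minima at x = +/-1.
grad_h = lambda x: 4.0 * x  # gradient of h(x) = 2x^2

# Subproblem argmin_x x^4 - c*x has the closed form x = (c/4)^(1/3).
solve_convex = lambda c: np.sign(c) * (abs(c) / 4.0) ** (1.0 / 3.0)

x_star = ccp(grad_h, 0.5, solve_convex)  # converges toward x = 1
```

Each CCP iteration replaces the concave part -h(x) with its tangent at the current point, yielding a convex surrogate that upper-bounds f; minimizing the surrogate therefore never increases f, which is the monotone-descent property behind the convergence guarantee mentioned in the abstract.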

Cite this Paper


BibTeX
@InProceedings{pmlr-v51-nguyen16b,
  title     = {Approximate Inference Using DC Programming For Collective Graphical Models},
  author    = {Nguyen, Thien and Kumar, Akshat and Lau, Hoong Chuin and Sheldon, Daniel},
  booktitle = {Proceedings of the 19th International Conference on Artificial Intelligence and Statistics},
  pages     = {685--693},
  year      = {2016},
  editor    = {Gretton, Arthur and Robert, Christian C.},
  volume    = {51},
  series    = {Proceedings of Machine Learning Research},
  address   = {Cadiz, Spain},
  month     = {09--11 May},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v51/nguyen16b.pdf},
  url       = {https://proceedings.mlr.press/v51/nguyen16b.html},
  abstract  = {Collective graphical models (CGMs) provide a framework for reasoning about a population of independent and identically distributed individuals when only noisy and aggregate observations are given. Previous approaches for inference in CGMs work on a junction-tree representation, thereby highly limiting their scalability. To remedy this, we show how the Bethe entropy approximation naturally arises for the inference problem in CGMs. We reformulate the resulting optimization problem as a difference-of-convex functions program that can capture different types of CGM noise models. Using the concave-convex procedure, we then develop a scalable message-passing algorithm. Empirically, our approach is highly scalable and accurate for large graphs, more than an order-of-magnitude faster than a generic optimization solver, and is guaranteed to converge unlike the previous message-passing approach NLBP that fails in several loopy graphs.}
}
Endnote
%0 Conference Paper
%T Approximate Inference Using DC Programming For Collective Graphical Models
%A Thien Nguyen
%A Akshat Kumar
%A Hoong Chuin Lau
%A Daniel Sheldon
%B Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2016
%E Arthur Gretton
%E Christian C. Robert
%F pmlr-v51-nguyen16b
%I PMLR
%P 685--693
%U https://proceedings.mlr.press/v51/nguyen16b.html
%V 51
%X Collective graphical models (CGMs) provide a framework for reasoning about a population of independent and identically distributed individuals when only noisy and aggregate observations are given. Previous approaches for inference in CGMs work on a junction-tree representation, thereby highly limiting their scalability. To remedy this, we show how the Bethe entropy approximation naturally arises for the inference problem in CGMs. We reformulate the resulting optimization problem as a difference-of-convex functions program that can capture different types of CGM noise models. Using the concave-convex procedure, we then develop a scalable message-passing algorithm. Empirically, our approach is highly scalable and accurate for large graphs, more than an order-of-magnitude faster than a generic optimization solver, and is guaranteed to converge unlike the previous message-passing approach NLBP that fails in several loopy graphs.
RIS
TY - CPAPER
TI - Approximate Inference Using DC Programming For Collective Graphical Models
AU - Thien Nguyen
AU - Akshat Kumar
AU - Hoong Chuin Lau
AU - Daniel Sheldon
BT - Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
DA - 2016/05/02
ED - Arthur Gretton
ED - Christian C. Robert
ID - pmlr-v51-nguyen16b
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 51
SP - 685
EP - 693
L1 - http://proceedings.mlr.press/v51/nguyen16b.pdf
UR - https://proceedings.mlr.press/v51/nguyen16b.html
AB - Collective graphical models (CGMs) provide a framework for reasoning about a population of independent and identically distributed individuals when only noisy and aggregate observations are given. Previous approaches for inference in CGMs work on a junction-tree representation, thereby highly limiting their scalability. To remedy this, we show how the Bethe entropy approximation naturally arises for the inference problem in CGMs. We reformulate the resulting optimization problem as a difference-of-convex functions program that can capture different types of CGM noise models. Using the concave-convex procedure, we then develop a scalable message-passing algorithm. Empirically, our approach is highly scalable and accurate for large graphs, more than an order-of-magnitude faster than a generic optimization solver, and is guaranteed to converge unlike the previous message-passing approach NLBP that fails in several loopy graphs.
ER -
APA
Nguyen, T., Kumar, A., Lau, H.C. & Sheldon, D. (2016). Approximate Inference Using DC Programming For Collective Graphical Models. Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 51:685-693. Available from https://proceedings.mlr.press/v51/nguyen16b.html.