Part & Clamp: Efficient Structured Output Learning

Patrick Pletscher, Cheng Soon Ong; Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics, PMLR 22:877-885, 2012.

Abstract

Discriminative training for general graphical models is a challenging task, due to the intractability of the partition function. We propose a computationally efficient approach to estimate the partition sum in a structured learning problem. The key idea is a lower bound of the partition sum that can be evaluated in a fixed number of message passing iterations. The bound makes use of a subset of the variables, a feedback vertex set, which allows us to decompose the graph into tractable parts. Furthermore, a tightening strategy for the bound is presented, which finds the states of the feedback vertex set that maximally increase the bound, and clamps them. Based on this lower bound we derive batch and online learning algorithms and demonstrate their effectiveness on a computer vision problem.
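
The "part and clamp" idea sketched in the abstract can be illustrated on a toy model. The following Python snippet (illustrative only; the model and names are hypothetical and not the authors' implementation) uses a four-variable binary cycle: clamping one variable, a feedback vertex set of size one, leaves a chain whose partition sum is exact via sum-product in a fixed number of steps. Summing the clamped partition sums over all states of the clamped variable recovers Z exactly; summing over only a subset of clamped states gives a lower bound, since every term is non-negative.

import numpy as np

K = 2                                   # binary states per variable
rng = np.random.default_rng(0)
# log-potentials for the cycle edges (x0,x1), (x1,x2), (x2,x3), (x3,x0)
log_psi = rng.normal(size=(4, K, K))

def chain_partition_sum(x0):
    """Exact partition sum over x1, x2, x3 with the feedback vertex x0 clamped."""
    m = np.exp(log_psi[0][x0, :])                 # factor psi(x0, x1): vector over x1
    m = m @ np.exp(log_psi[1])                    # sum out x1 -> message over x2
    m = m @ np.exp(log_psi[2])                    # sum out x2 -> message over x3
    return float(m @ np.exp(log_psi[3][:, x0]))   # sum out x3, closing edge (x3, x0)

# Exact Z: sum the clamped partition sums over all states of the feedback vertex.
Z_exact = sum(chain_partition_sum(s) for s in range(K))
# Lower bound: keep only a subset of clamped states (here a single one); all
# terms are non-negative, so any partial sum lower-bounds the full partition sum.
Z_bound = chain_partition_sum(0)
assert Z_bound <= Z_exact
print(Z_exact, Z_bound)

The tightening strategy described in the abstract corresponds, in this picture, to choosing which clamped states of the feedback vertex set to add so that the partial sum grows as much as possible.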

Cite this Paper


BibTeX
@InProceedings{pmlr-v22-pletscher12a,
  title     = {Part & Clamp: Efficient Structured Output Learning},
  author    = {Patrick Pletscher and Cheng Soon Ong},
  booktitle = {Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics},
  pages     = {877--885},
  year      = {2012},
  editor    = {Neil D. Lawrence and Mark Girolami},
  volume    = {22},
  series    = {Proceedings of Machine Learning Research},
  address   = {La Palma, Canary Islands},
  month     = {21--23 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v22/pletscher12a/pletscher12a.pdf},
  url       = {http://proceedings.mlr.press/v22/pletscher12a.html},
  abstract  = {Discriminative training for general graphical models is a challenging task, due to the intractability of the partition function. We propose a computationally efficient approach to estimate the partition sum in a structured learning problem. The key idea is a lower bound of the partition sum that can be evaluated in a fixed number of message passing iterations. The bound makes use of a subset of the variables, a feedback vertex set, which allows us to decompose the graph into tractable parts. Furthermore, a tightening strategy for the bound is presented, which finds the states of the feedback vertex set that maximally increase the bound, and clamps them. Based on this lower bound we derive batch and online learning algorithms and demonstrate their effectiveness on a computer vision problem.}
}
Endnote
%0 Conference Paper
%T Part & Clamp: Efficient Structured Output Learning
%A Patrick Pletscher
%A Cheng Soon Ong
%B Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2012
%E Neil D. Lawrence
%E Mark Girolami
%F pmlr-v22-pletscher12a
%I PMLR
%J Proceedings of Machine Learning Research
%P 877--885
%U http://proceedings.mlr.press
%V 22
%W PMLR
%X Discriminative training for general graphical models is a challenging task, due to the intractability of the partition function. We propose a computationally efficient approach to estimate the partition sum in a structured learning problem. The key idea is a lower bound of the partition sum that can be evaluated in a fixed number of message passing iterations. The bound makes use of a subset of the variables, a feedback vertex set, which allows us to decompose the graph into tractable parts. Furthermore, a tightening strategy for the bound is presented, which finds the states of the feedback vertex set that maximally increase the bound, and clamps them. Based on this lower bound we derive batch and online learning algorithms and demonstrate their effectiveness on a computer vision problem.
RIS
TY - CPAPER
TI - Part & Clamp: Efficient Structured Output Learning
AU - Patrick Pletscher
AU - Cheng Soon Ong
BT - Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics
PY - 2012/03/21
DA - 2012/03/21
ED - Neil D. Lawrence
ED - Mark Girolami
ID - pmlr-v22-pletscher12a
PB - PMLR
SP - 877
DP - PMLR
EP - 885
L1 - http://proceedings.mlr.press/v22/pletscher12a/pletscher12a.pdf
UR - http://proceedings.mlr.press/v22/pletscher12a.html
AB - Discriminative training for general graphical models is a challenging task, due to the intractability of the partition function. We propose a computationally efficient approach to estimate the partition sum in a structured learning problem. The key idea is a lower bound of the partition sum that can be evaluated in a fixed number of message passing iterations. The bound makes use of a subset of the variables, a feedback vertex set, which allows us to decompose the graph into tractable parts. Furthermore, a tightening strategy for the bound is presented, which finds the states of the feedback vertex set that maximally increase the bound, and clamps them. Based on this lower bound we derive batch and online learning algorithms and demonstrate their effectiveness on a computer vision problem.
ER -
APA
Pletscher, P. & Ong, C.S. (2012). Part & Clamp: Efficient Structured Output Learning. Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics, in PMLR 22:877-885.
