Collective Stability in Structured Prediction: Generalization from One Example

Ben London, Bert Huang, Ben Taskar, Lise Getoor
Proceedings of the 30th International Conference on Machine Learning, PMLR 28(3):828-836, 2013.

Abstract

Structured predictors enable joint inference over multiple interdependent output variables. These models are often trained on a small number of examples with large internal structure. Existing distribution-free generalization bounds do not guarantee generalization in this setting, though this contradicts a large body of empirical evidence from computer vision, natural language processing, social networks and other fields. In this paper, we identify a set of natural conditions – weak dependence, hypothesis complexity and a new measure, collective stability – that are sufficient for generalization from even a single example, without imposing an explicit generative model of the data. We then demonstrate that the complexity and stability conditions are satisfied by a broad class of models, including marginal inference in templated graphical models. We thus obtain uniform convergence rates that can decrease significantly faster than previous bounds, particularly when each structured example is sufficiently large and the number of training examples is constant, even one.
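
As a rough sketch of the paper's central new quantity (paraphrased here with illustrative notation, not quoted from the paper), the collective stability condition can be written as follows: a class of vector-valued hypotheses is uniformly collectively stable when perturbing a single coordinate of the input changes the joint output by only a bounded amount.

% Sketch only: h maps a structured input z to a vector of joint (soft) outputs;
% z and z' denote inputs that differ in at most one coordinate.
\[
  \lVert h(z) - h(z') \rVert_{1} \;\le\; \beta
  \qquad \text{for all } h \in \mathcal{H}.
\]

Informally, the smaller the constant \beta, the less a local change to the input can move the collective prediction, which is the property the generalization bounds described above exploit.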

Cite this Paper


BibTeX
@InProceedings{pmlr-v28-london13,
  title     = {Collective Stability in Structured Prediction: Generalization from One Example},
  author    = {London, Ben and Huang, Bert and Taskar, Ben and Getoor, Lise},
  booktitle = {Proceedings of the 30th International Conference on Machine Learning},
  pages     = {828--836},
  year      = {2013},
  editor    = {Dasgupta, Sanjoy and McAllester, David},
  volume    = {28},
  number    = {3},
  series    = {Proceedings of Machine Learning Research},
  address   = {Atlanta, Georgia, USA},
  month     = {17--19 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v28/london13.pdf},
  url       = {https://proceedings.mlr.press/v28/london13.html}
}
Endnote
%0 Conference Paper
%T Collective Stability in Structured Prediction: Generalization from One Example
%A Ben London
%A Bert Huang
%A Ben Taskar
%A Lise Getoor
%B Proceedings of the 30th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2013
%E Sanjoy Dasgupta
%E David McAllester
%F pmlr-v28-london13
%I PMLR
%P 828--836
%U https://proceedings.mlr.press/v28/london13.html
%V 28
%N 3
RIS
TY - CPAPER
TI - Collective Stability in Structured Prediction: Generalization from One Example
AU - Ben London
AU - Bert Huang
AU - Ben Taskar
AU - Lise Getoor
BT - Proceedings of the 30th International Conference on Machine Learning
DA - 2013/05/26
ED - Sanjoy Dasgupta
ED - David McAllester
ID - pmlr-v28-london13
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 28
IS - 3
SP - 828
EP - 836
L1 - http://proceedings.mlr.press/v28/london13.pdf
UR - https://proceedings.mlr.press/v28/london13.html
ER -
APA
London, B., Huang, B., Taskar, B. & Getoor, L. (2013). Collective Stability in Structured Prediction: Generalization from One Example. Proceedings of the 30th International Conference on Machine Learning, in Proceedings of Machine Learning Research 28(3):828-836. Available from https://proceedings.mlr.press/v28/london13.html.