Empirical Risk Minimization of Graphical Model Parameters Given Approximate Inference, Decoding, and Model Structure


Veselin Stoyanov, Alexander Ropson, Jason Eisner;
Proceedings of the Fourteenth International Conference on Artificial Intelligence and Statistics, PMLR 15:725-733, 2011.

Abstract

Graphical models are often used “inappropriately,” with approximations in the topology, inference, and prediction. Yet it is still common to train their parameters to approximately maximize training likelihood. We argue that instead, one should seek the parameters that minimize the empirical risk of the entire imperfect system. We show how to locally optimize this risk using back-propagation and stochastic meta-descent. Over a range of synthetic-data problems, compared to the usual practice of choosing approximate MAP parameters, our approach significantly reduces loss on test data, sometimes by an order of magnitude.
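The core idea of the abstract — treating approximate inference as a differentiable computation and minimizing the task loss of the whole pipeline — can be illustrated with a minimal sketch. The example below is not the paper's system: it uses a toy two-variable Ising model, unrolled mean-field updates as the approximate inference procedure, a squared-error risk on the marginals, and plain gradient descent in place of stochastic meta-descent. All names and the targets `y1, y2` are hypothetical.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def mean_field_forward(t1, t2, w, K=10):
    """Unrolled mean-field inference on a 2-variable Ising model with
    unary parameters t1, t2 and coupling w. Returns the approximate
    marginals q1, q2 and the intermediates needed for back-propagation."""
    q2 = 0.5
    trace = []
    for _ in range(K):
        q2_in = q2
        q1 = sigmoid(t1 + w * q2_in)
        q2 = sigmoid(t2 + w * q1)
        trace.append((q2_in, q1, q2))
    return q1, q2, trace

def loss_and_grads(t1, t2, w, y1, y2, K=10):
    """Empirical risk (squared error on the marginals) and its gradients,
    obtained by back-propagating through the unrolled inference updates."""
    q1, q2, trace = mean_field_forward(t1, t2, w, K)
    L = (q1 - y1) ** 2 + (q2 - y2) ** 2
    dq1 = 2.0 * (q1 - y1)   # only the final marginals enter the loss
    dq2 = 2.0 * (q2 - y2)
    g_t1 = g_t2 = g_w = 0.0
    for q2_in, q1_k, q2_k in reversed(trace):
        # reverse of: q2 = sigmoid(t2 + w * q1)
        da2 = dq2 * q2_k * (1.0 - q2_k)
        g_t2 += da2
        g_w += da2 * q1_k
        dq1 += da2 * w
        # reverse of: q1 = sigmoid(t1 + w * q2_in)
        da1 = dq1 * q1_k * (1.0 - q1_k)
        g_t1 += da1
        g_w += da1 * q2_in
        dq2 = da1 * w   # flows back to the previous iteration's q2
        dq1 = 0.0       # the previous q1 is overwritten, no further flow
    return L, (g_t1, g_t2, g_w)

# Train the model parameters by gradient descent on the risk of the
# whole (approximate) pipeline, not on training likelihood.
t1, t2, w = 0.0, 0.0, 0.0
y1, y2 = 0.9, 0.2   # toy target marginals standing in for training data
for _ in range(200):
    L, (g1, g2, gw) = loss_and_grads(t1, t2, w, y1, y2)
    t1 -= 0.5 * g1
    t2 -= 0.5 * g2
    w -= 0.5 * gw
```

The key design point, mirroring the abstract, is that the gradient is taken through the inference procedure actually used at test time, so any bias introduced by the approximation is compensated for during training.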
