Efficiently Enforcing Diversity in Multi-Output Structured Prediction


Abner Guzman-Rivera, Pushmeet Kohli, Dhruv Batra, Rob Rutenbar ;
Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics, PMLR 33:284-292, 2014.

Abstract

This paper proposes a novel method for efficiently generating multiple diverse predictions for structured prediction problems. Existing methods such as SDPPs or DivMBest produce a set of predictions sequentially, conditioning each prediction on those made before it. Such approaches are inherently sequential and computationally expensive. In contrast, our method, Diverse Multiple Choice Learning, learns a set of models that make multiple independent, yet diverse, predictions at test-time. We achieve this by including a diversity-encouraging term in the loss function used for training the models. This approach encourages diversity in the predictions while preserving computational efficiency at test-time. Experimental results on a number of challenging problems show that our method learns models that not only predict more diverse results than competing methods, but also generalize better and produce results with high test accuracy.
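To make the "multiple independent, yet diverse, predictors" idea concrete, the sketch below trains a small ensemble with a multiple-choice (oracle) loss: each training example is scored only against the predictor that currently fits it best, which drives the predictors to specialize on different modes of the data. This is a simplified illustration on a toy 1-D regression task, not the paper's actual training objective; the dataset, model class (linear, no bias), and optimization details are all assumptions made for this example, and the paper's explicit diversity term is omitted here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy multimodal data: each input x has two plausible targets, +x or -x,
# so a single model cannot fit the data but two specialized models can.
X = rng.uniform(-1, 1, size=(200, 1))
y = np.where(rng.random(200) < 0.5, X[:, 0], -X[:, 0])

M = 2                                   # number of predictors in the ensemble
W = rng.normal(size=(M, 1))             # one linear model per predictor

lr = 0.1
for _ in range(200):
    preds = X @ W.T                     # (n, M): every predictor's output
    errs = (preds - y[:, None]) ** 2    # per-predictor squared error
    assign = errs.argmin(axis=1)        # "oracle": best predictor per example
    for m in range(M):
        mask = assign == m
        if mask.any():
            # Gradient step using only the examples assigned to predictor m.
            grad = 2 * ((X[mask] @ W[m]) - y[mask]) @ X[mask] / mask.sum()
            W[m] -= lr * grad
```

Because each example only penalizes its best-matching predictor, the two models end up with slopes near +1 and -1, giving diverse predictions at test-time with no sequential dependence between them.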
