A Bayesian Divergence Prior for Classifier Adaptation


Xiao Li, Jeff Bilmes;
Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics, PMLR 2:275-282, 2007.

Abstract

Adaptation of statistical classifiers is critical when a target (or testing) distribution is different from the distribution that governs training data. In such cases, a classifier optimized for the training distribution needs to be adapted for optimal use in the target distribution. This paper presents a Bayesian “divergence prior” for generic classifier adaptation. Instantiations of this prior lead to simple yet principled adaptation strategies for a variety of classifiers, which yield superior performance in practice. In addition, this paper derives several adaptation error bounds by applying the divergence prior in the PAC-Bayesian setting.
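As a rough illustration of how such a divergence prior can translate into a simple adaptation strategy, the sketch below shows MAP-style adaptation of a logistic-regression classifier, where a Gaussian prior centered at the source-trained weights acts as an L2 penalty pulling the adapted model toward the original one. This is a hedged, hypothetical instantiation for illustration only; the variable names, the penalty strength `lam`, and the synthetic data are not from the paper.

```python
# Hypothetical sketch: adapt a source-trained logistic regression to a small
# target sample by penalizing deviation from the source weights (a Gaussian
# prior centered at the source parameters, i.e. an L2 divergence penalty).
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_logreg(X, y, w_init, lam=0.0, w_prior=None, lr=0.1, steps=500):
    """Minimize logistic loss + (lam/2) * ||w - w_prior||^2 by gradient descent."""
    w = w_init.copy()
    if w_prior is None:
        w_prior = np.zeros_like(w)
    for _ in range(steps):
        p = sigmoid(X @ w)
        grad = X.T @ (p - y) / len(y) + lam * (w - w_prior)
        w -= lr * grad
    return w

rng = np.random.default_rng(0)

# Source (training) data, and a small sample from a shifted target distribution.
X_src = rng.normal(size=(1000, 3))
y_src = (X_src @ np.array([1.5, -2.0, 0.5]) > 0).astype(float)
X_tgt = rng.normal(loc=0.5, size=(30, 3))
y_tgt = (X_tgt @ np.array([1.0, -2.5, 0.8]) > 0).astype(float)

# Unadapted classifier trained on the source distribution.
w_src = train_logreg(X_src, y_src, np.zeros(3))

# Adaptation: the prior centered at w_src keeps the adapted model close to the
# source classifier while fitting the limited target data.
w_adapt = train_logreg(X_tgt, y_tgt, w_src, lam=1.0, w_prior=w_src)
```

The design choice here mirrors the general idea in the abstract: with little target data, the prior regularizes adaptation toward the already-trained classifier rather than refitting from scratch.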
