A Unified Statistically Efficient Estimation Framework for Unnormalized Models

Masatoshi Uehara, Takafumi Kanamori, Takashi Takenouchi, Takeru Matsuda;
Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, PMLR 108:809-819, 2020.

Abstract

Parameter estimation for unnormalized models is a challenging problem. Maximum likelihood estimation (MLE) is computationally infeasible for these models because the normalizing constant cannot be computed explicitly. Although several consistent estimators have been proposed, the question of statistical efficiency remains. In this study, we propose a unified, statistically efficient estimation framework for unnormalized models, together with several efficient estimators whose asymptotic variance matches that of the MLE. The computational cost of these estimators is reasonable, and they can be applied whether the sample space is discrete or continuous. The loss functions of the proposed estimators are derived by combining two methods: (1) density-ratio matching using Bregman divergence, and (2) plugging in nonparametric estimators. We also analyze the properties of the proposed estimators when the unnormalized models are misspecified. Experimental results demonstrate the advantages of our method over existing approaches.
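To make the density-ratio-matching idea concrete, the following is a minimal sketch of noise-contrastive estimation (NCE), one well-known density-ratio-matching estimator that falls within the Bregman-divergence family; it is an illustration of the general setting, not the paper's specific efficient estimator. The unnormalized model, noise distribution, and sample sizes below are illustrative assumptions. The key point is that the log normalizing constant is treated as a free parameter `c`, so the MLE's intractable normalizer never needs to be computed.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Illustrative unnormalized model: p_tilde(x; theta) = exp(-theta * x^2 / 2),
# a zero-mean Gaussian with precision theta, normalizer deliberately omitted.
theta_true = 2.0
n = 5000

x_data = rng.normal(0.0, 1.0 / np.sqrt(theta_true), size=n)  # observed samples
x_noise = rng.normal(0.0, 1.0, size=n)                        # noise ~ N(0, 1)

def log_p_tilde(x, theta):
    """Log of the unnormalized model density."""
    return -0.5 * theta * x**2

def log_p_noise(x):
    """Log density of the standard normal noise distribution."""
    return -0.5 * x**2 - 0.5 * np.log(2.0 * np.pi)

def nce_loss(params):
    """Logistic loss for classifying data vs. noise via the log density ratio.

    G(x) = log p_tilde(x; theta) + c - log p_noise(x), where c estimates
    the negative log normalizing constant -log Z(theta).
    """
    theta, c = params
    g_data = log_p_tilde(x_data, theta) + c - log_p_noise(x_data)
    g_noise = log_p_tilde(x_noise, theta) + c - log_p_noise(x_noise)
    # -log sigmoid(G) = logaddexp(0, -G); -log(1 - sigmoid(G)) = logaddexp(0, G)
    return np.mean(np.logaddexp(0.0, -g_data)) + np.mean(np.logaddexp(0.0, g_noise))

result = minimize(nce_loss, x0=[1.0, 0.0])
theta_hat, c_hat = result.x
```

Here `c_hat` should approach `-log Z(theta) = -0.5 * log(2*pi/theta)`, so the normalizing constant is recovered as a by-product of estimation rather than computed by integration.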