Sparse Adaptive Dirichlet-Multinomial-like Processes
Proceedings of the 26th Annual Conference on Learning Theory, PMLR 30:432-459, 2013.
Abstract
Online estimation and modelling of i.i.d. data for short sequences over large or complex “alphabets” is a ubiquitous (sub)problem in machine learning, information theory, data compression, statistical language processing, and document analysis. The Dirichlet-Multinomial distribution (also called Polya urn scheme) and extensions thereof are widely applied for online i.i.d. estimation. Good a-priori choices for the parameters in this regime are difficult to obtain, though. I derive an optimal adaptive choice for the main parameter via tight, data-dependent redundancy bounds for a related model. The 1-line recommendation is to set the ‘total mass’ = ‘precision’ = ‘concentration’ parameter to $m/[2\ln\frac{n+1}{m}]$, where n is the (past) sample size and m the number of different symbols observed (so far). The resulting estimator is simple, online, and fast, and its experimental performance is superb.
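To make the recommendation concrete, here is a minimal sketch of a Polya-urn-style online predictor that plugs the adaptive concentration $\beta = m/[2\ln\frac{n+1}{m}]$ into the standard Dirichlet-Multinomial predictive rule. The sketch assumes a known finite alphabet, a uniform base measure, and a fallback value of 1.0 before any data is seen; these choices are illustrative and differ from the paper's actual sparse model, which treats unseen symbols more carefully.

```python
import math
from collections import Counter

def adaptive_concentration(n, m):
    """Adaptive total-mass parameter beta = m / (2 ln((n+1)/m)),
    where n is the past sample size and m the number of distinct
    symbols observed so far. The n = 0 fallback is an assumption,
    not part of the paper's recommendation."""
    if m == 0:
        return 1.0
    # Since m <= n, ln((n+1)/m) > 0, so the division below is safe.
    return m / (2 * math.log((n + 1) / m))

def predictive_probs(counts, alphabet):
    """Dirichlet-Multinomial (Polya urn) predictive distribution with
    the adaptive concentration and a uniform base measure over a known
    finite alphabet (both assumptions of this sketch)."""
    n = sum(counts.values())
    m = len(counts)
    beta = adaptive_concentration(n, m)
    K = len(alphabet)
    return {x: (counts.get(x, 0) + beta / K) / (n + beta) for x in alphabet}

# Example: predictive probabilities after observing the sequence "aabab"
alphabet = ["a", "b", "c", "d"]
counts = Counter("aabab")
print(predictive_probs(counts, alphabet))
```

In this toy run, n = 5 and m = 2, so beta = 2 / (2 ln 3) ≈ 0.91; observed symbols keep most of the probability mass, while the remaining mass is spread uniformly over the alphabet, which is the qualitative behaviour the adaptive choice is designed to achieve.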