Marginalizing Out Transition Probabilities for Several Subclasses of PFAs
Proceedings of the Eleventh International Conference on Grammatical Inference, PMLR 21:259-263, 2012.
A Bayesian approach that marginalizes out transition probabilities can be applied generally to various kinds of probabilistic finite state machine models. Based on this approach, we implemented and compared three algorithms: a variable-length n-gram model, a state-merging method for PDFAs, and collapsed Gibbs sampling for PFAs. Among these, collapsed Gibbs sampling for PFAs performed best on the data from the pre-competition stage of PAutomaC, although it requires substantial computational resources.
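As an illustration of the marginalization idea (a minimal sketch, not the paper's exact model): if each state's outgoing-transition distribution is given a symmetric Dirichlet prior, the transition probabilities can be integrated out analytically, so a collapsed Gibbs sampler only needs to track counts and evaluate a Dirichlet-multinomial predictive probability. The function name, the count layout, and the choice alpha=0.5 below are illustrative assumptions.

```python
from collections import defaultdict

def predictive_prob(counts, state, arc, num_arcs, alpha=0.5):
    """Predictive probability of taking `arc` from `state` with the
    transition parameters marginalized out under a symmetric
    Dirichlet(alpha) prior:
        (n_{state,arc} + alpha) / (n_state + num_arcs * alpha)
    (illustrative sketch; not the paper's exact model)."""
    n_sa = counts[state][arc]          # count of this (state, arc) pair
    n_s = sum(counts[state].values())  # total outgoing count from state
    return (n_sa + alpha) / (n_s + num_arcs * alpha)

# Toy usage: from state 0, arc "a" was observed 3 times and "b" once.
counts = defaultdict(lambda: defaultdict(int))
counts[0]["a"] += 3
counts[0]["b"] += 1
p = predictive_prob(counts, 0, "a", num_arcs=2, alpha=0.5)
# (3 + 0.5) / (4 + 2 * 0.5) = 0.7
```

A collapsed Gibbs sampler resamples each hidden state assignment using such count-based predictive probabilities, decrementing the counts for the variable being resampled and incrementing them again after drawing a new value.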