Treba: Efficient Numerically Stable EM for PFA
Proceedings of the Eleventh International Conference on Grammatical Inference, PMLR 21:249-253, 2012.
Abstract
Training probabilistic finite automata with the EM/Baum-Welch algorithm is computationally very intensive, especially if random ergodic automata are used as the starting point and additional strategies such as deterministic annealing are employed. In this paper we present optimization and parallelization strategies for the Baum-Welch algorithm that often allow much larger automata to be trained on a larger number of observations. The tool, treba, which implements these optimizations, is available open source, and its results were used to participate in the PAutomaC PFA/HMM competition.
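The numerical-stability concern named in the title arises because the forward/backward quantities underlying Baum-Welch shrink exponentially with sequence length and underflow in plain floating point; a standard remedy is to compute in log space with log-sum-exp. The sketch below illustrates that idea for the forward pass over a PFA. It is an illustrative example only, not treba's actual implementation, and the dense tables log_pi, log_T, and log_F are hypothetical data structures assumed just for this sketch.

```python
import math

def logsumexp(xs):
    """Numerically stable log(sum(exp(x) for x in xs))."""
    m = max(xs)
    if m == -math.inf:
        return -math.inf
    return m + math.log(sum(math.exp(x - m) for x in xs))

def forward_logprob(seq, log_pi, log_T, log_F):
    """Log-probability of a symbol sequence under a PFA, computed in log space.

    log_pi[q]       : log initial probability of state q
    log_T[q][a][q2] : log probability of emitting symbol a while moving q -> q2
    log_F[q]        : log probability of halting in state q

    These dense tables are assumptions made for this sketch.
    """
    n_states = len(log_pi)
    alpha = list(log_pi)  # alpha[q] = log P(prefix so far, current state q)
    for a in seq:
        alpha = [
            logsumexp([alpha[q] + log_T[q][a][q2] for q in range(n_states)])
            for q2 in range(n_states)
        ]
    return logsumexp([alpha[q] + log_F[q] for q in range(n_states)])
```

Working with sums of logs rather than products of probabilities keeps the computation well scaled even for long observation sequences; the same trick applies to the backward pass and the expected-count accumulation in the E-step.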