Treba: Efficient Numerically Stable EM for PFA

Mans Hulden
Proceedings of the Eleventh International Conference on Grammatical Inference, PMLR 21:249-253, 2012.

Abstract

Training probabilistic finite automata with the EM/Baum-Welch algorithm is computationally very intensive, especially if random ergodic automata are used as the starting point and additional strategies such as deterministic annealing are employed. In this paper we present optimization and parallelization strategies for the Baum-Welch algorithm that often allow training much larger automata on larger sets of observations. The tool, treba, which implements these optimizations, is available as open source, and its results were used to participate in the PAutomaC PFA/HMM competition.
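The numerical stability mentioned in the title is typically achieved by carrying out the forward/backward computations of Baum-Welch in log space. The sketch below is an illustrative Python example of a log-space forward pass over a PFA using a log-sum-exp accumulator; it is not treba's actual implementation, and the data-structure names (log_trans, log_final) are assumptions made for the example.

    import math

    def logsumexp(values):
        """Numerically stable log(sum(exp(v) for v in values))."""
        m = max(values)
        if m == -math.inf:
            return -math.inf
        return m + math.log(sum(math.exp(v - m) for v in values))

    def forward_log_prob(observation, log_trans, log_final, initial_state=0):
        """Log-probability of an observation sequence under a PFA.

        log_trans[(state, symbol)] is a list of (target_state, log_prob) pairs;
        log_final[state] is the log-probability of halting in that state.
        These structures are illustrative, not treba's internal representation.
        """
        num_states = len(log_final)
        alpha = [-math.inf] * num_states
        alpha[initial_state] = 0.0  # log(1): all mass starts in the initial state
        for symbol in observation:
            contributions = [[] for _ in range(num_states)]
            for state in range(num_states):
                if alpha[state] == -math.inf:
                    continue
                for target, logp in log_trans.get((state, symbol), []):
                    contributions[target].append(alpha[state] + logp)
            alpha = [logsumexp(v) if v else -math.inf for v in contributions]
        # Weight each forward value by the halting probability of its state.
        return logsumexp([a + f for a, f in zip(alpha, log_final)])

    # Example: a one-state PFA that emits 'a' with probability 0.5 and halts with probability 0.5.
    log_trans = {(0, 'a'): [(0, math.log(0.5))]}
    log_final = [math.log(0.5)]
    print(forward_log_prob(['a', 'a'], log_trans, log_final))  # log(0.5 * 0.5 * 0.5)

Working in log space trades a few extra exp/log calls per cell for protection against the underflow that otherwise occurs when many small probabilities are multiplied over long observation sequences.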

Cite this Paper


BibTeX
@InProceedings{pmlr-v21-hulden12a,
  title     = {Treba: Efficient Numerically Stable EM for PFA},
  author    = {Hulden, Mans},
  booktitle = {Proceedings of the Eleventh International Conference on Grammatical Inference},
  pages     = {249--253},
  year      = {2012},
  editor    = {Heinz, Jeffrey and Higuera, Colin and Oates, Tim},
  volume    = {21},
  series    = {Proceedings of Machine Learning Research},
  address   = {University of Maryland, College Park, MD, USA},
  month     = {05--08 Sep},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v21/hulden12a/hulden12a.pdf},
  url       = {https://proceedings.mlr.press/v21/hulden12a.html},
  abstract  = {Training probabilistic finite automata with the EM/Baum-Welch algorithm is computationally very intensive, especially if random ergodic automata are used as the starting point and additional strategies such as deterministic annealing are employed. In this paper we present optimization and parallelization strategies for the Baum-Welch algorithm that often allow training much larger automata on larger sets of observations. The tool, \emph{treba}, which implements these optimizations, is available as open source, and its results were used to participate in the PAutomaC PFA/HMM competition.}
}
Endnote
%0 Conference Paper
%T Treba: Efficient Numerically Stable EM for PFA
%A Mans Hulden
%B Proceedings of the Eleventh International Conference on Grammatical Inference
%C Proceedings of Machine Learning Research
%D 2012
%E Jeffrey Heinz
%E Colin Higuera
%E Tim Oates
%F pmlr-v21-hulden12a
%I PMLR
%P 249--253
%U https://proceedings.mlr.press/v21/hulden12a.html
%V 21
%X Training probabilistic finite automata with the EM/Baum-Welch algorithm is computationally very intensive, especially if random ergodic automata are used as the starting point and additional strategies such as deterministic annealing are employed. In this paper we present optimization and parallelization strategies for the Baum-Welch algorithm that often allow training much larger automata on larger sets of observations. The tool, treba, which implements these optimizations, is available as open source, and its results were used to participate in the PAutomaC PFA/HMM competition.
RIS
TY  - CPAPER
TI  - Treba: Efficient Numerically Stable EM for PFA
AU  - Mans Hulden
BT  - Proceedings of the Eleventh International Conference on Grammatical Inference
DA  - 2012/08/16
ED  - Jeffrey Heinz
ED  - Colin Higuera
ED  - Tim Oates
ID  - pmlr-v21-hulden12a
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 21
SP  - 249
EP  - 253
L1  - http://proceedings.mlr.press/v21/hulden12a/hulden12a.pdf
UR  - https://proceedings.mlr.press/v21/hulden12a.html
AB  - Training probabilistic finite automata with the EM/Baum-Welch algorithm is computationally very intensive, especially if random ergodic automata are used as the starting point and additional strategies such as deterministic annealing are employed. In this paper we present optimization and parallelization strategies for the Baum-Welch algorithm that often allow training much larger automata on larger sets of observations. The tool, treba, which implements these optimizations, is available as open source, and its results were used to participate in the PAutomaC PFA/HMM competition.
ER  -
APA
Hulden, M. (2012). Treba: Efficient Numerically Stable EM for PFA. Proceedings of the Eleventh International Conference on Grammatical Inference, in Proceedings of Machine Learning Research 21:249-253. Available from https://proceedings.mlr.press/v21/hulden12a.html.