Simple Variable Length N-grams for Probabilistic Automata Learning
Proceedings of the Eleventh International Conference on Grammatical Inference, PMLR 21:254-258, 2012.
Abstract
This paper describes an approach used in the 2012 Probabilistic Automata Learning Competition. The main goal of the competition was to gain insight into which techniques and approaches work best for sequence learning over different kinds of probabilistic automata used as generating machines. This paper proposes the use of n-gram models with variable length. Experiments on the test sets provided by the competition show that the variable-length approach outperforms fixed 3-grams.
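To make the idea concrete, the core of a variable-length n-gram model can be sketched as follows. This is a hedged illustration, not the paper's actual implementation: it counts n-grams up to a maximum order and, at prediction time, uses the longest context seen in training, backing off to shorter contexts (here a simple longest-match rule with no smoothing; the function and variable names are my own).

```python
from collections import defaultdict

def train_counts(sequences, max_order):
    """Count all n-grams up to max_order; contexts are stored as tuples."""
    counts = defaultdict(int)          # (context, symbol) -> count
    context_counts = defaultdict(int)  # context -> count
    for seq in sequences:
        padded = ["<s>"] * (max_order - 1) + list(seq) + ["</s>"]
        for i in range(max_order - 1, len(padded)):
            for n in range(1, max_order + 1):
                context = tuple(padded[i - n + 1:i])  # length n-1 context
                counts[(context, padded[i])] += 1
                context_counts[context] += 1
    return counts, context_counts

def prob(symbol, history, counts, context_counts, max_order):
    """Variable-length estimate: relative frequency under the longest
    context observed in training, falling back to shorter contexts."""
    history = tuple(history)
    for n in range(min(max_order - 1, len(history)), -1, -1):
        context = history[len(history) - n:]
        if context_counts[context] > 0:
            return counts[(context, symbol)] / context_counts[context]
    return 0.0  # symbol alphabet entirely unseen

# Example: after training on "ab", "ab", "ac" with max_order=2,
# P(b | a) is estimated from the bigram counts, while an unseen
# context such as "z" backs off to the unigram distribution.
counts, ctx = train_counts(["ab", "ab", "ac"], 2)
p_seen = prob("b", ["a"], counts, ctx, 2)
p_backoff = prob("b", ["z"], counts, ctx, 2)
```

In a competition setting the fixed 3-gram baseline always conditions on exactly two symbols, whereas the variable-length model above adapts the context length to the available evidence, which is the intuition behind the paper's reported improvement.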