A Hierarchy of Context-Free Languages Learnable from Positive Data and Membership Queries
Proceedings of the Fifteenth International Conference on Grammatical Inference, PMLR 153:18-31, 2021.
We consider a generalization of the “dual” approach to distributional learning of context-free grammars, in which each nonterminal $A$ is associated with a string set $X_A$ characterized by a finite set $C$ of contexts. Rather than letting $X_A$ be the set of all strings accepted by every context in $C$, as in previous work, we allow a more flexible use of the contexts in $C$: some are used positively (contexts that accept every string in $X_A$) and others negatively (contexts that accept no string in $X_A$). The resulting more general algorithm works in essentially the same way as before, but succeeds on a larger class of context-free languages.
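To make the positive/negative use of contexts concrete, here is a minimal sketch (the function names and the toy language are assumptions for illustration, not from the paper): a context $(l, r)$ accepts a string $w$ exactly when $lwr$ belongs to the target language $L$, queried through a membership oracle, and $X_A$ consists of the strings accepted by all positive contexts and by no negative context.

```python
def accepts(member, context, w):
    """True iff wrapping w in the context (l, r) yields a string in L."""
    l, r = context
    return member(l + w + r)

def in_X_A(member, positives, negatives, w):
    """w is in X_A iff every positive context accepts it
    and no negative context accepts it."""
    return (all(accepts(member, c, w) for c in positives)
            and not any(accepts(member, c, w) for c in negatives))

# Toy membership oracle: balanced strings over {a, b},
# with 'a' as an open bracket and 'b' as a close bracket.
def member(w):
    depth = 0
    for ch in w:
        depth += 1 if ch == 'a' else -1
        if depth < 0:
            return False
    return depth == 0

# One positive context ("", "") accepting exactly the members of L,
# and one negative context ("", "b") accepting no balanced string.
positives = [("", "")]
negatives = [("", "b")]
print(in_X_A(member, positives, negatives, "ab"))    # True: balanced
print(in_X_A(member, positives, negatives, "aab"))   # False: unbalanced
```

In the previous, purely positive setting, $X_A$ would be determined by the intersection of the positive contexts alone; the negative contexts are the extra flexibility this generalization adds.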