Grammar Induction: Beyond Local Search

Jason Eisner
Proceedings of the Eleventh International Conference on Grammatical Inference, PMLR 21:112-113, 2012.

Abstract

Many approaches to probabilistic grammar induction operate by iteratively improving a single grammar, beginning with an initial guess. These local search paradigms include (variational) EM, MCMC, and greedy model merging or splitting procedures. Unfortunately, local search methods tend to get caught in local optima, even with random restarts. Two approaches are outlined that try to avoid this problem. One uses branch-and-bound methods from mathematical programming to eliminate regions of parameter space that cannot contain the global optimum. The other is inspired by recent work on deep learning, and uses spectral methods to build up featural representations of all substrings, without premature commitment to which substrings are constituents.
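
As a toy illustration of the first idea (this is not code from the paper), the sketch below runs branch-and-bound over a single parameter, using a Lipschitz upper bound to discard intervals that provably cannot contain the global optimum of a multimodal objective standing in for a grammar's log-likelihood. The objective, the Lipschitz constant, and all names here are assumptions made for the example.

```python
import heapq
import math

# Toy multimodal objective standing in for a grammar's log-likelihood
# over one parameter theta in [0, 1]. (Hypothetical example.)
def log_likelihood(theta):
    return math.sin(13 * theta) * math.sin(27 * theta)

LIPSCHITZ = 40.0  # assumed bound on |f'| over [0, 1] (13 + 27 by the product rule)

def branch_and_bound(lo=0.0, hi=1.0, tol=1e-6):
    """Globally maximize log_likelihood on [lo, hi].

    Keeps a priority queue of intervals ordered by upper bound and prunes
    any interval whose bound falls below the best value found so far, so
    regions that cannot contain the global optimum are eliminated.
    """
    best_theta, best_val = lo, log_likelihood(lo)

    # Upper bound on f over [a, b]: midpoint value + L * half-width.
    def upper(a, b):
        return log_likelihood((a + b) / 2) + LIPSCHITZ * (b - a) / 2

    heap = [(-upper(lo, hi), lo, hi)]  # max-heap via negated bounds
    while heap:
        neg_ub, a, b = heapq.heappop(heap)
        if -neg_ub <= best_val + tol:
            break  # no remaining interval can beat the incumbent
        mid = (a + b) / 2
        if log_likelihood(mid) > best_val:
            best_theta, best_val = mid, log_likelihood(mid)
        for a2, b2 in ((a, mid), (mid, b)):
            ub = upper(a2, b2)
            if ub > best_val + tol:  # prune dominated subintervals
                heapq.heappush(heap, (-ub, a2, b2))
    return best_theta, best_val

print(branch_and_bound())
```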
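The second idea can likewise be illustrated with a minimal spectral sketch (again an assumption-laden toy, not the paper's algorithm): collect a substring-by-context co-occurrence matrix from a tiny corpus and take a truncated SVD, so that every substring receives a low-dimensional feature vector before any decision is made about which substrings are constituents.

```python
import numpy as np

# Tiny corpus; in practice this would be unannotated text. (Toy data.)
corpus = ["the dog barks", "the cat sleeps", "a dog sleeps", "a cat barks"]

# Enumerate ALL substrings (as token spans) with their left/right contexts,
# deliberately not committing to which spans are constituents.
counts = {}
substrings, contexts = {}, {}
for sent in corpus:
    toks = ["<s>"] + sent.split() + ["</s>"]
    for i in range(1, len(toks) - 1):
        for j in range(i + 1, len(toks)):
            span = " ".join(toks[i:j])
            ctx = (toks[i - 1], toks[j])  # (left neighbor, right neighbor)
            r = substrings.setdefault(span, len(substrings))
            c = contexts.setdefault(ctx, len(contexts))
            counts[(r, c)] = counts.get((r, c), 0) + 1

# Dense substring-context count matrix.
M = np.zeros((len(substrings), len(contexts)))
for (r, c), n in counts.items():
    M[r, c] = n

# Truncated SVD: rows of U * S give featural representations of substrings.
U, S, Vt = np.linalg.svd(M, full_matrices=False)
k = 3
features = U[:, :k] * S[:k]

for span in ["dog", "cat", "the dog", "a cat"]:
    print(span, np.round(features[substrings[span]], 2))
```

Under this setup, substrings that occur in similar contexts (e.g., "dog" and "cat") end up with similar feature vectors, which is the kind of representation the abstract describes building before committing to a constituent structure.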

Cite this Paper


BibTeX
@InProceedings{pmlr-v21-eisner12a,
  title     = {Grammar Induction: Beyond Local Search},
  author    = {Eisner, Jason},
  booktitle = {Proceedings of the Eleventh International Conference on Grammatical Inference},
  pages     = {112--113},
  year      = {2012},
  editor    = {Heinz, Jeffrey and de la Higuera, Colin and Oates, Tim},
  volume    = {21},
  series    = {Proceedings of Machine Learning Research},
  address   = {University of Maryland, College Park, MD, USA},
  month     = {05--08 Sep},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v21/eisner12a/eisner12a.pdf},
  url       = {https://proceedings.mlr.press/v21/eisner12a.html}
}
APA
Eisner, J. (2012). Grammar Induction: Beyond Local Search. Proceedings of the Eleventh International Conference on Grammatical Inference, in Proceedings of Machine Learning Research 21:112-113. Available from https://proceedings.mlr.press/v21/eisner12a.html.
