A Hierarchical Nonparametric Bayesian Approach to Statistical Language Model Domain Adaptation

Frank Wood, Yee Whye Teh
Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, PMLR 5:607-614, 2009.

Abstract

In this paper we present a doubly hierarchical Pitman-Yor process language model. Its bottom layer of hierarchy consists of multiple hierarchical Pitman-Yor process language models, one each for some number of domains. The novel top layer of hierarchy consists of a mechanism to couple together multiple language models such that they share statistical strength. Intuitively this sharing results in the "adaptation" of a latent shared language model to each domain. We introduce a general formalism capable of describing the overall model which we call the graphical Pitman-Yor process and explain how to perform Bayesian inference in it. We present encouraging language model domain adaptation results that both illustrate the potential benefits of our new model and suggest new avenues of inquiry.
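The two layers of hierarchy described above can be made concrete with a small sketch. The LaTeX block below is an illustration only, not the paper's own equations: it assumes the standard hierarchical Pitman-Yor back-off structure for the shared latent language model, with each domain-specific distribution drawn from a Pitman-Yor process centred on the corresponding shared distribution. The symbols u (an n-gram context), sigma(u) (the context with its earliest word dropped), m (a domain index), and the discount/concentration parameters d and theta are notational assumptions, and the paper's graphical Pitman-Yor process generalizes this kind of coupling.

% Minimal sketch (not the paper's exact formulation) of a doubly
% hierarchical Pitman-Yor construction for domain adaptation.
% G^{(0)} is the shared latent language model; G^{(m)} is domain m's model.
\begin{align*}
  % Shared latent HPYLM: each context backs off to its shortened context
  G^{(0)}_{u} \mid G^{(0)}_{\sigma(u)}
    &\sim \mathrm{PY}\!\left(d_{|u|},\, \theta_{|u|},\, G^{(0)}_{\sigma(u)}\right) \\
  % Top-layer coupling (assumed form): each domain-specific distribution
  % is centred on the corresponding shared distribution
  G^{(m)}_{u} \mid G^{(0)}_{u}
    &\sim \mathrm{PY}\!\left(d^{(m)}_{|u|},\, \theta^{(m)}_{|u|},\, G^{(0)}_{u}\right) \\
  % Words observed in domain m under context u
  w \mid u, m &\sim G^{(m)}_{u}
\end{align*}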

Cite this Paper


BibTeX
@InProceedings{pmlr-v5-wood09a, title = {A Hierarchical Nonparametric Bayesian Approach to Statistical Language Model Domain Adaptation}, author = {Frank Wood and Yee Whye Teh}, pages = {607--614}, year = {2009}, editor = {David van Dyk and Max Welling}, volume = {5}, series = {Proceedings of Machine Learning Research}, address = {Hilton Clearwater Beach Resort, Clearwater Beach, Florida USA}, month = {16--18 Apr}, publisher = {PMLR}, pdf = {http://proceedings.mlr.press/v5/wood09a/wood09a.pdf}, url = {http://proceedings.mlr.press/v5/wood09a.html}, abstract = {In this paper we present a doubly hierarchical Pitman-Yor process language model. Its bottom layer of hierarchy consists of multiple hierarchical Pitman-Yor process language models, one each for some number of domains. The novel top layer of hierarchy consists of a mechanism to couple together multiple language models such that they share statistical strength. Intuitively this sharing results in the "adaptation" of a latent shared language model to each domain. We introduce a general formalism capable of describing the overall model which we call the graphical Pitman-Yor process and explain how to perform Bayesian inference in it. We present encouraging language model domain adaptation results that both illustrate the potential benefits of our new model and suggest new avenues of inquiry.} }
Endnote
%0 Conference Paper %T A Hierarchical Nonparametric Bayesian Approach to Statistical Language Model Domain Adaptation %A Frank Wood %A Yee Whye Teh %B Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics %C Proceedings of Machine Learning Research %D 2009 %E David van Dyk %E Max Welling %F pmlr-v5-wood09a %I PMLR %J Proceedings of Machine Learning Research %P 607--614 %U http://proceedings.mlr.press %V 5 %W PMLR %X In this paper we present a doubly hierarchical Pitman-Yor process language model. Its bottom layer of hierarchy consists of multiple hierarchical Pitman-Yor process language models, one each for some number of domains. The novel top layer of hierarchy consists of a mechanism to couple together multiple language models such that they share statistical strength. Intuitively this sharing results in the "adaptation" of a latent shared language model to each domain. We introduce a general formalism capable of describing the overall model which we call the graphical Pitman-Yor process and explain how to perform Bayesian inference in it. We present encouraging language model domain adaptation results that both illustrate the potential benefits of our new model and suggest new avenues of inquiry.
RIS
TY - CPAPER TI - A Hierarchical Nonparametric Bayesian Approach to Statistical Language Model Domain Adaptation AU - Frank Wood AU - Yee Whye Teh BT - Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics PY - 2009/04/15 DA - 2009/04/15 ED - David van Dyk ED - Max Welling ID - pmlr-v5-wood09a PB - PMLR SP - 607 DP - PMLR EP - 614 L1 - http://proceedings.mlr.press/v5/wood09a/wood09a.pdf UR - http://proceedings.mlr.press/v5/wood09a.html AB - In this paper we present a doubly hierarchical Pitman-Yor process language model. Its bottom layer of hierarchy consists of multiple hierarchical Pitman-Yor process language models, one each for some number of domains. The novel top layer of hierarchy consists of a mechanism to couple together multiple language models such that they share statistical strength. Intuitively this sharing results in the "adaptation" of a latent shared language model to each domain. We introduce a general formalism capable of describing the overall model which we call the graphical Pitman-Yor process and explain how to perform Bayesian inference in it. We present encouraging language model domain adaptation results that both illustrate the potential benefits of our new model and suggest new avenues of inquiry. ER -
APA
Wood, F. & Teh, Y. W. (2009). A Hierarchical Nonparametric Bayesian Approach to Statistical Language Model Domain Adaptation. Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, in PMLR 5:607-614
