Breaking Sticks and Ambiguities with Adaptive Skip-gram

Sergey Bartunov, Dmitry Kondrashkin, Anton Osokin, Dmitry Vetrov
Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, PMLR 51:130-138, 2016.

Abstract

The recently proposed Skip-gram model is a powerful method for learning high-dimensional word representations that capture rich semantic relationships between words. However, Skip-gram, like most prior work on learning word representations, does not take word ambiguity into account and maintains only a single representation per word. Although a number of Skip-gram modifications have been proposed to overcome this limitation and learn multi-prototype word representations, they either require the number of word meanings to be known in advance or learn them with greedy heuristics. In this paper we propose the Adaptive Skip-gram model, a nonparametric Bayesian extension of Skip-gram capable of automatically learning the required number of representations for all words at the desired semantic resolution. We derive an efficient online variational learning algorithm for the model and empirically demonstrate its effectiveness on the word-sense induction task.
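The "breaking sticks" in the title refers to the stick-breaking construction underlying the model's nonparametric prior over word senses: probability mass is assigned to an open-ended set of sense prototypes, so the effective number of senses per word adapts to the data. As a rough illustration only (not the paper's implementation, and with the concentration parameter `alpha` and truncation level `T` chosen arbitrarily), a truncated stick-breaking prior can be sketched as:

```python
import numpy as np

def stick_breaking_weights(alpha, T, seed=None):
    """Sample sense-prior weights from a truncated stick-breaking process.

    alpha : concentration parameter; small values put most mass on a few
            senses, large values spread mass over many senses.
    T     : truncation level, i.e. the maximum number of senses considered.
    """
    rng = np.random.default_rng(seed)
    # Beta(1, alpha) stick proportions: the fraction broken off at each step.
    betas = rng.beta(1.0, alpha, size=T)
    # Length of stick remaining before each break: prod_{j<k} (1 - beta_j).
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - betas[:-1])))
    weights = betas * remaining
    # Fold the leftover stick into the last weight so the result sums to 1.
    weights[-1] = 1.0 - weights[:-1].sum()
    return weights

weights = stick_breaking_weights(alpha=1.0, T=10, seed=0)
print(weights)  # a distribution over up to T candidate senses
```

With small `alpha`, most sampled weight vectors concentrate on the first few senses, which is how the model can keep only as many prototypes per word as the corpus supports.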

Cite this Paper


BibTeX
@InProceedings{pmlr-v51-bartunov16,
  title     = {Breaking Sticks and Ambiguities with Adaptive Skip-gram},
  author    = {Sergey Bartunov and Dmitry Kondrashkin and Anton Osokin and Dmitry Vetrov},
  pages     = {130--138},
  year      = {2016},
  editor    = {Arthur Gretton and Christian C. Robert},
  volume    = {51},
  series    = {Proceedings of Machine Learning Research},
  address   = {Cadiz, Spain},
  month     = {09--11 May},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v51/bartunov16.pdf},
  url       = {http://proceedings.mlr.press/v51/bartunov16.html},
  abstract  = {The recently proposed Skip-gram model is a powerful method for learning high-dimensional word representations that capture rich semantic relationships between words. However, Skip-gram, like most prior work on learning word representations, does not take word ambiguity into account and maintains only a single representation per word. Although a number of Skip-gram modifications have been proposed to overcome this limitation and learn multi-prototype word representations, they either require the number of word meanings to be known in advance or learn them with greedy heuristics. In this paper we propose the Adaptive Skip-gram model, a nonparametric Bayesian extension of Skip-gram capable of automatically learning the required number of representations for all words at the desired semantic resolution. We derive an efficient online variational learning algorithm for the model and empirically demonstrate its effectiveness on the word-sense induction task.}
}
Endnote
%0 Conference Paper
%T Breaking Sticks and Ambiguities with Adaptive Skip-gram
%A Sergey Bartunov
%A Dmitry Kondrashkin
%A Anton Osokin
%A Dmitry Vetrov
%B Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2016
%E Arthur Gretton
%E Christian C. Robert
%F pmlr-v51-bartunov16
%I PMLR
%J Proceedings of Machine Learning Research
%P 130--138
%U http://proceedings.mlr.press
%V 51
%W PMLR
%X The recently proposed Skip-gram model is a powerful method for learning high-dimensional word representations that capture rich semantic relationships between words. However, Skip-gram, like most prior work on learning word representations, does not take word ambiguity into account and maintains only a single representation per word. Although a number of Skip-gram modifications have been proposed to overcome this limitation and learn multi-prototype word representations, they either require the number of word meanings to be known in advance or learn them with greedy heuristics. In this paper we propose the Adaptive Skip-gram model, a nonparametric Bayesian extension of Skip-gram capable of automatically learning the required number of representations for all words at the desired semantic resolution. We derive an efficient online variational learning algorithm for the model and empirically demonstrate its effectiveness on the word-sense induction task.
RIS
TY  - CPAPER
TI  - Breaking Sticks and Ambiguities with Adaptive Skip-gram
AU  - Sergey Bartunov
AU  - Dmitry Kondrashkin
AU  - Anton Osokin
AU  - Dmitry Vetrov
BT  - Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
PY  - 2016/05/02
DA  - 2016/05/02
ED  - Arthur Gretton
ED  - Christian C. Robert
ID  - pmlr-v51-bartunov16
PB  - PMLR
SP  - 130
DP  - PMLR
EP  - 138
L1  - http://proceedings.mlr.press/v51/bartunov16.pdf
UR  - http://proceedings.mlr.press/v51/bartunov16.html
AB  - The recently proposed Skip-gram model is a powerful method for learning high-dimensional word representations that capture rich semantic relationships between words. However, Skip-gram, like most prior work on learning word representations, does not take word ambiguity into account and maintains only a single representation per word. Although a number of Skip-gram modifications have been proposed to overcome this limitation and learn multi-prototype word representations, they either require the number of word meanings to be known in advance or learn them with greedy heuristics. In this paper we propose the Adaptive Skip-gram model, a nonparametric Bayesian extension of Skip-gram capable of automatically learning the required number of representations for all words at the desired semantic resolution. We derive an efficient online variational learning algorithm for the model and empirically demonstrate its effectiveness on the word-sense induction task.
ER  -
APA
Bartunov, S., Kondrashkin, D., Osokin, A., & Vetrov, D. (2016). Breaking Sticks and Ambiguities with Adaptive Skip-gram. Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, in PMLR 51:130-138.
