Learning Word Representations with Hierarchical Sparse Coding

Dani Yogatama, Manaal Faruqui, Chris Dyer, Noah Smith
Proceedings of the 32nd International Conference on Machine Learning, PMLR 37:87-96, 2015.

Abstract

We propose a new method for learning word representations using hierarchical regularization in sparse coding inspired by the linguistic study of word meanings. We show an efficient learning algorithm based on stochastic proximal methods that is significantly faster than previous approaches, making it possible to perform hierarchical sparse coding on a corpus of billions of word tokens. Experiments on various benchmark tasks—word similarity ranking, syntactic and semantic analogies, sentence completion, and sentiment analysis—demonstrate that the method outperforms or is competitive with state-of-the-art methods.
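The abstract's core machinery can be illustrated with a minimal sketch: sparse coding with a plain ℓ1 regularizer solved by proximal gradient descent (ISTA). This is a simplification of the paper's method, which uses a hierarchical (tree-structured) regularizer and learns the dictionary jointly; here the dictionary `D` is fixed and all names are illustrative.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def sparse_code(X, D, lam=0.1, step=0.005, iters=300):
    """Proximal gradient (ISTA) for min_A 0.5 * ||X - D A||_F^2 + lam * ||A||_1.

    X: (m, n) data matrix (e.g. columns of word cooccurrence statistics),
    D: (m, k) fixed dictionary, returns A: (k, n) sparse codes.
    """
    A = np.zeros((D.shape[1], X.shape[1]))
    for _ in range(iters):
        grad = D.T @ (D @ A - X)                         # gradient of the smooth loss
        A = soft_threshold(A - step * grad, step * lam)  # proximal (shrinkage) step
    return A

rng = np.random.default_rng(0)
D = rng.standard_normal((20, 30))   # toy overcomplete dictionary
X = rng.standard_normal((20, 5))    # toy data columns
A = sparse_code(X, D)
```

The paper's hierarchical variant replaces the elementwise shrinkage with the proximal operator of a tree-structured group norm, which zeroes out whole subtrees of codes at once; the stochastic aspect comes from applying such updates on sampled minibatches of columns.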

Cite this Paper


BibTeX
@InProceedings{pmlr-v37-yogatama15,
  title     = {Learning Word Representations with Hierarchical Sparse Coding},
  author    = {Yogatama, Dani and Faruqui, Manaal and Dyer, Chris and Smith, Noah},
  booktitle = {Proceedings of the 32nd International Conference on Machine Learning},
  pages     = {87--96},
  year      = {2015},
  editor    = {Bach, Francis and Blei, David},
  volume    = {37},
  series    = {Proceedings of Machine Learning Research},
  address   = {Lille, France},
  month     = {07--09 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v37/yogatama15.pdf},
  url       = {http://proceedings.mlr.press/v37/yogatama15.html},
  abstract  = {We propose a new method for learning word representations using hierarchical regularization in sparse coding inspired by the linguistic study of word meanings. We show an efficient learning algorithm based on stochastic proximal methods that is significantly faster than previous approaches, making it possible to perform hierarchical sparse coding on a corpus of billions of word tokens. Experiments on various benchmark tasks—word similarity ranking, syntactic and semantic analogies, sentence completion, and sentiment analysis—demonstrate that the method outperforms or is competitive with state-of-the-art methods.}
}
Endnote
%0 Conference Paper
%T Learning Word Representations with Hierarchical Sparse Coding
%A Dani Yogatama
%A Manaal Faruqui
%A Chris Dyer
%A Noah Smith
%B Proceedings of the 32nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2015
%E Francis Bach
%E David Blei
%F pmlr-v37-yogatama15
%I PMLR
%P 87--96
%U http://proceedings.mlr.press/v37/yogatama15.html
%V 37
%X We propose a new method for learning word representations using hierarchical regularization in sparse coding inspired by the linguistic study of word meanings. We show an efficient learning algorithm based on stochastic proximal methods that is significantly faster than previous approaches, making it possible to perform hierarchical sparse coding on a corpus of billions of word tokens. Experiments on various benchmark tasks—word similarity ranking, syntactic and semantic analogies, sentence completion, and sentiment analysis—demonstrate that the method outperforms or is competitive with state-of-the-art methods.
RIS
TY  - CPAPER
TI  - Learning Word Representations with Hierarchical Sparse Coding
AU  - Dani Yogatama
AU  - Manaal Faruqui
AU  - Chris Dyer
AU  - Noah Smith
BT  - Proceedings of the 32nd International Conference on Machine Learning
DA  - 2015/06/01
ED  - Francis Bach
ED  - David Blei
ID  - pmlr-v37-yogatama15
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 37
SP  - 87
EP  - 96
L1  - http://proceedings.mlr.press/v37/yogatama15.pdf
UR  - http://proceedings.mlr.press/v37/yogatama15.html
AB  - We propose a new method for learning word representations using hierarchical regularization in sparse coding inspired by the linguistic study of word meanings. We show an efficient learning algorithm based on stochastic proximal methods that is significantly faster than previous approaches, making it possible to perform hierarchical sparse coding on a corpus of billions of word tokens. Experiments on various benchmark tasks—word similarity ranking, syntactic and semantic analogies, sentence completion, and sentiment analysis—demonstrate that the method outperforms or is competitive with state-of-the-art methods.
ER  -
APA
Yogatama, D., Faruqui, M., Dyer, C. & Smith, N. (2015). Learning Word Representations with Hierarchical Sparse Coding. Proceedings of the 32nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 37:87-96. Available from http://proceedings.mlr.press/v37/yogatama15.html.