A Unified Framework for Jointly Learning Distributed Representations of Word and Attributes

Liqiang Niu, Xin-Yu Dai, Shujian Huang, Jiajun Chen
Asian Conference on Machine Learning, PMLR 45:143-156, 2016.

Abstract

Distributed word representations have achieved great success in the natural language processing (NLP) area. However, most distributed models focus on local context properties and learn task-specific representations individually, and therefore lack the ability to fuse multiple attributes and learn jointly. In this paper, we propose a unified framework that jointly learns distributed representations of words and their attributes, i.e., characteristics of words. In our models, we consider three types of attributes: topic, lemma, and document. Besides learning distributed attribute representations, we find that using additional attributes is beneficial for improving word representations. Several experiments are conducted to evaluate the performance of the learned topic representations, document representations, and improved word representations, respectively. The experimental results show that our models achieve significant and competitive results.
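The core idea of the framework can be illustrated with a small sketch. This is not the authors' code; it is a hypothetical, minimal CBOW-style model in which words and attributes (topic, lemma, and document identifiers, here invented symbols such as `topic:animals` and `doc:0`) share a single embedding matrix, so that a target word is predicted from the fused context of surrounding words plus the attributes attached to the text:

```python
import numpy as np

# Hypothetical sketch of joint word/attribute embedding learning (not the
# paper's implementation): words and attribute symbols share one embedding
# table E, and a CBOW-style softmax predicts a target word from the averaged
# context vectors of neighboring words and the attributes of the sentence.

rng = np.random.default_rng(0)

vocab = ["the", "cat", "sat", "mat"]                 # word symbols
attrs = ["topic:animals", "doc:0", "lemma:sit"]      # assumed attribute symbols
symbols = vocab + attrs
idx = {s: i for i, s in enumerate(symbols)}

dim = 8
E = rng.normal(scale=0.1, size=(len(symbols), dim))  # shared input embeddings
W = rng.normal(scale=0.1, size=(dim, len(vocab)))    # output layer over words

def step(context, target, lr=0.5):
    """One SGD step: predict `target` from fused word + attribute context."""
    rows = [idx[s] for s in context]
    h = E[rows].mean(axis=0)                 # fuse words and attributes
    z = h @ W
    p = np.exp(z - z.max()); p /= p.sum()    # softmax over the word vocabulary
    loss = -np.log(p[idx[target]])
    g = p.copy(); g[idx[target]] -= 1.0      # dL/dz
    gh = W @ g                               # dL/dh
    W[:] -= lr * np.outer(h, g)
    for r in rows:                           # shared gradient for each symbol
        E[r] -= lr * gh / len(rows)
    return loss

context = ["the", "sat", "topic:animals", "doc:0"]
losses = [step(context, "cat") for _ in range(50)]
print(losses[0], losses[-1])  # loss should decrease across steps
```

Because the attribute symbols live in the same embedding table as words, training produces distributed representations for topics and documents as a by-product, which matches the paper's claim that attributes can both be represented and help improve word vectors.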

Cite this Paper


BibTeX
@InProceedings{pmlr-v45-Niu15,
  title     = {A Unified Framework for Jointly Learning Distributed Representations of Word and Attributes},
  author    = {Niu, Liqiang and Dai, Xin-Yu and Huang, Shujian and Chen, Jiajun},
  booktitle = {Asian Conference on Machine Learning},
  pages     = {143--156},
  year      = {2016},
  editor    = {Holmes, Geoffrey and Liu, Tie-Yan},
  volume    = {45},
  series    = {Proceedings of Machine Learning Research},
  address   = {Hong Kong},
  month     = {20--22 Nov},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v45/Niu15.pdf},
  url       = {https://proceedings.mlr.press/v45/Niu15.html},
  abstract  = {Distributed word representations have achieved great success in the natural language processing (NLP) area. However, most distributed models focus on local context properties and learn task-specific representations individually, and therefore lack the ability to fuse multiple attributes and learn jointly. In this paper, we propose a unified framework that jointly learns distributed representations of words and their attributes, i.e., characteristics of words. In our models, we consider three types of attributes: topic, lemma, and document. Besides learning distributed attribute representations, we find that using additional attributes is beneficial for improving word representations. Several experiments are conducted to evaluate the performance of the learned topic representations, document representations, and improved word representations, respectively. The experimental results show that our models achieve significant and competitive results.}
}
Endnote
%0 Conference Paper
%T A Unified Framework for Jointly Learning Distributed Representations of Word and Attributes
%A Liqiang Niu
%A Xin-Yu Dai
%A Shujian Huang
%A Jiajun Chen
%B Asian Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2016
%E Geoffrey Holmes
%E Tie-Yan Liu
%F pmlr-v45-Niu15
%I PMLR
%P 143--156
%U https://proceedings.mlr.press/v45/Niu15.html
%V 45
%X Distributed word representations have achieved great success in the natural language processing (NLP) area. However, most distributed models focus on local context properties and learn task-specific representations individually, and therefore lack the ability to fuse multiple attributes and learn jointly. In this paper, we propose a unified framework that jointly learns distributed representations of words and their attributes, i.e., characteristics of words. In our models, we consider three types of attributes: topic, lemma, and document. Besides learning distributed attribute representations, we find that using additional attributes is beneficial for improving word representations. Several experiments are conducted to evaluate the performance of the learned topic representations, document representations, and improved word representations, respectively. The experimental results show that our models achieve significant and competitive results.
RIS
TY  - CPAPER
TI  - A Unified Framework for Jointly Learning Distributed Representations of Word and Attributes
AU  - Liqiang Niu
AU  - Xin-Yu Dai
AU  - Shujian Huang
AU  - Jiajun Chen
BT  - Asian Conference on Machine Learning
DA  - 2016/02/25
ED  - Geoffrey Holmes
ED  - Tie-Yan Liu
ID  - pmlr-v45-Niu15
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 45
SP  - 143
EP  - 156
L1  - http://proceedings.mlr.press/v45/Niu15.pdf
UR  - https://proceedings.mlr.press/v45/Niu15.html
AB  - Distributed word representations have achieved great success in the natural language processing (NLP) area. However, most distributed models focus on local context properties and learn task-specific representations individually, and therefore lack the ability to fuse multiple attributes and learn jointly. In this paper, we propose a unified framework that jointly learns distributed representations of words and their attributes, i.e., characteristics of words. In our models, we consider three types of attributes: topic, lemma, and document. Besides learning distributed attribute representations, we find that using additional attributes is beneficial for improving word representations. Several experiments are conducted to evaluate the performance of the learned topic representations, document representations, and improved word representations, respectively. The experimental results show that our models achieve significant and competitive results.
ER  -
APA
Niu, L., Dai, X., Huang, S., & Chen, J. (2016). A Unified Framework for Jointly Learning Distributed Representations of Word and Attributes. Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 45:143-156. Available from https://proceedings.mlr.press/v45/Niu15.html.