A Unified Framework for Jointly Learning Distributed Representations of Word and Attributes


Liqiang Niu, Xin-Yu Dai, Shujian Huang, Jiajun Chen;
Asian Conference on Machine Learning, PMLR 45:143-156, 2016.

Abstract

Distributed word representations have achieved great success in natural language processing (NLP). However, most distributed models focus on local context properties and learn task-specific representations individually; they therefore lack the ability to fuse multiple attributes and learn jointly. In this paper, we propose a unified framework that jointly learns distributed representations of words and their attributes, i.e., characteristics of a word. Our models consider three types of attributes: topic, lemma, and document. Besides learning distributed attribute representations, we find that using additional attributes helps improve word representations. We conduct several experiments to evaluate the learned topic representations, document representations, and improved word representations, respectively. The experimental results show that our models achieve significant and competitive results.
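The joint learning idea described above can be illustrated with a minimal sketch. This is an assumption for illustration, not the authors' exact model: a CBOW-style objective in which attribute vectors (e.g. a topic or document id) are averaged together with context word vectors to predict the target word, so that word and attribute embeddings receive a shared gradient and are learned jointly.

```python
import numpy as np

rng = np.random.default_rng(0)
V, A, D = 6, 3, 8                            # vocab size, attribute count, embedding dim
W_in = rng.normal(scale=0.1, size=(V, D))    # input word embeddings
W_attr = rng.normal(scale=0.1, size=(A, D))  # attribute embeddings (topic/lemma/document)
W_out = rng.normal(scale=0.1, size=(V, D))   # output (softmax) embeddings

def train_step(context_ids, attr_ids, target_id, lr=0.1):
    """One SGD step on a single (context words, attributes, target) example."""
    vecs = np.vstack([W_in[context_ids], W_attr[attr_ids]])
    h = vecs.mean(axis=0)                    # fuse context and attribute vectors
    scores = W_out @ h
    probs = np.exp(scores - scores.max())
    probs /= probs.sum()                     # softmax over the vocabulary
    grad = probs.copy()
    grad[target_id] -= 1.0                   # d(cross-entropy)/d(scores)
    dh = W_out.T @ grad
    W_out[:] -= lr * np.outer(grad, h)
    # the shared gradient flows into both word and attribute embeddings
    W_in[context_ids] -= lr * dh / len(vecs)
    W_attr[attr_ids] -= lr * dh / len(vecs)
    return -np.log(probs[target_id])         # cross-entropy loss for this example

# toy example: target word 4 co-occurs with context words [1, 2] under attribute 0
losses = [train_step([1, 2], [0], 4) for _ in range(50)]
print(losses[0] > losses[-1])  # → True: loss decreases as joint embeddings fit
```

Because the hidden vector averages word and attribute embeddings, the attribute rows of `W_attr` end up in the same space as the word vectors, which is what allows the attributes to act as extra evidence when improving word representations.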
