Long Short-term Memory Network over Rhetorical Structure Theory for Sentence-level Sentiment Analysis

Xianghua Fu, Wangwang Liu, Yingying Xu, Chong Yu, Ting Wang
Proceedings of The 8th Asian Conference on Machine Learning, PMLR 63:17-32, 2016.

Abstract

Sentence-level sentiment analysis with deep learning models remains a challenging task. The long short-term memory (LSTM) network mitigates the vanishing gradient problem of recurrent neural networks (RNNs), but its linear chain structure cannot capture the structure of text. Tree-LSTM was later proposed; it uses the LSTM forget gate to skip sub-trees that have little effect on the result, achieving good performance and showing that the chain-structured LSTM depends strongly on text structure. However, Tree-LSTM cannot explicitly determine which sub-trees are important and which have little effect. We propose a simple model that uses Rhetorical Structure Theory (RST) to parse text. By building the LSTM network on the RST parse structure, we make full use of the LSTM's structural characteristics to automatically enhance the nucleus information and filter the satellite information of the text. Furthermore, the resulting representations take the relations between text segments into account, which improves the semantic representation of the text. Experimental results show that this method not only achieves higher classification accuracy, but also trains quickly.
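To make the idea concrete, here is a toy sketch of composing a sentence vector bottom-up over a binarized RST parse, where each internal node keeps most of its nucleus child and damps its satellite child. This is an illustration only, not the paper's actual model: the names (`RSTNode`, `compose`) and the fixed weights stand in for the learned forget-gate behaviour of an LSTM built on the tree.

```python
# Illustrative sketch, not the authors' model: fixed weights stand in
# for a learned LSTM forget gate applied along the RST parse tree.

NUCLEUS_WEIGHT = 0.9    # nucleus information is largely kept
SATELLITE_WEIGHT = 0.3  # satellite information is heavily filtered

class RSTNode:
    """A node in a binarized RST parse: either a leaf elementary
    discourse unit (EDU) with a vector, or an internal node with a
    nucleus child and a satellite child."""
    def __init__(self, vec=None, nucleus=None, satellite=None):
        self.vec = vec
        self.nucleus = nucleus
        self.satellite = satellite

def compose(node):
    """Bottom-up composition: emphasize the nucleus, damp the satellite."""
    if node.vec is not None:  # leaf EDU: return its vector directly
        return node.vec
    n = compose(node.nucleus)
    s = compose(node.satellite)
    return [NUCLEUS_WEIGHT * a + SATELLITE_WEIGHT * b for a, b in zip(n, s)]

# Example: "[the film is great]N [although it runs long]S"
tree = RSTNode(nucleus=RSTNode(vec=[1.0, 0.0]),
               satellite=RSTNode(vec=[0.0, 1.0]))
sentence_vec = compose(tree)  # -> [0.9, 0.3]: the nucleus dominates
```

In the paper's actual model the per-node transformation is an LSTM cell whose gates are learned, so the degree of filtering adapts to the relation between segments rather than being fixed as here.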

Cite this Paper


BibTeX
@InProceedings{pmlr-v63-Fu62,
  title = {Long Short-term Memory Network over Rhetorical Structure Theory for Sentence-level Sentiment Analysis},
  author = {Fu, Xianghua and Liu, Wangwang and Xu, Yingying and Yu, Chong and Wang, Ting},
  booktitle = {Proceedings of The 8th Asian Conference on Machine Learning},
  pages = {17--32},
  year = {2016},
  editor = {Durrant, Robert J. and Kim, Kee-Eung},
  volume = {63},
  series = {Proceedings of Machine Learning Research},
  address = {The University of Waikato, Hamilton, New Zealand},
  month = {16--18 Nov},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v63/Fu62.pdf},
  url = {https://proceedings.mlr.press/v63/Fu62.html},
  abstract = {Sentence-level sentiment analysis with deep learning models remains a challenging task. The long short-term memory (LSTM) network mitigates the vanishing gradient problem of recurrent neural networks (RNNs), but its linear chain structure cannot capture the structure of text. Tree-LSTM was later proposed; it uses the LSTM forget gate to skip sub-trees that have little effect on the result, achieving good performance and showing that the chain-structured LSTM depends strongly on text structure. However, Tree-LSTM cannot explicitly determine which sub-trees are important and which have little effect. We propose a simple model that uses Rhetorical Structure Theory (RST) to parse text. By building the LSTM network on the RST parse structure, we make full use of the LSTM's structural characteristics to automatically enhance the nucleus information and filter the satellite information of the text. Furthermore, the resulting representations take the relations between text segments into account, which improves the semantic representation of the text. Experimental results show that this method not only achieves higher classification accuracy, but also trains quickly.}
}
Endnote
%0 Conference Paper
%T Long Short-term Memory Network over Rhetorical Structure Theory for Sentence-level Sentiment Analysis
%A Xianghua Fu
%A Wangwang Liu
%A Yingying Xu
%A Chong Yu
%A Ting Wang
%B Proceedings of The 8th Asian Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2016
%E Robert J. Durrant
%E Kee-Eung Kim
%F pmlr-v63-Fu62
%I PMLR
%P 17--32
%U https://proceedings.mlr.press/v63/Fu62.html
%V 63
%X Sentence-level sentiment analysis with deep learning models remains a challenging task. The long short-term memory (LSTM) network mitigates the vanishing gradient problem of recurrent neural networks (RNNs), but its linear chain structure cannot capture the structure of text. Tree-LSTM was later proposed; it uses the LSTM forget gate to skip sub-trees that have little effect on the result, achieving good performance and showing that the chain-structured LSTM depends strongly on text structure. However, Tree-LSTM cannot explicitly determine which sub-trees are important and which have little effect. We propose a simple model that uses Rhetorical Structure Theory (RST) to parse text. By building the LSTM network on the RST parse structure, we make full use of the LSTM's structural characteristics to automatically enhance the nucleus information and filter the satellite information of the text. Furthermore, the resulting representations take the relations between text segments into account, which improves the semantic representation of the text. Experimental results show that this method not only achieves higher classification accuracy, but also trains quickly.
RIS
TY - CPAPER
TI - Long Short-term Memory Network over Rhetorical Structure Theory for Sentence-level Sentiment Analysis
AU - Xianghua Fu
AU - Wangwang Liu
AU - Yingying Xu
AU - Chong Yu
AU - Ting Wang
BT - Proceedings of The 8th Asian Conference on Machine Learning
DA - 2016/11/20
ED - Robert J. Durrant
ED - Kee-Eung Kim
ID - pmlr-v63-Fu62
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 63
SP - 17
EP - 32
L1 - http://proceedings.mlr.press/v63/Fu62.pdf
UR - https://proceedings.mlr.press/v63/Fu62.html
AB - Sentence-level sentiment analysis with deep learning models remains a challenging task. The long short-term memory (LSTM) network mitigates the vanishing gradient problem of recurrent neural networks (RNNs), but its linear chain structure cannot capture the structure of text. Tree-LSTM was later proposed; it uses the LSTM forget gate to skip sub-trees that have little effect on the result, achieving good performance and showing that the chain-structured LSTM depends strongly on text structure. However, Tree-LSTM cannot explicitly determine which sub-trees are important and which have little effect. We propose a simple model that uses Rhetorical Structure Theory (RST) to parse text. By building the LSTM network on the RST parse structure, we make full use of the LSTM's structural characteristics to automatically enhance the nucleus information and filter the satellite information of the text. Furthermore, the resulting representations take the relations between text segments into account, which improves the semantic representation of the text. Experimental results show that this method not only achieves higher classification accuracy, but also trains quickly.
ER -
APA
Fu, X., Liu, W., Xu, Y., Yu, C. & Wang, T. (2016). Long Short-term Memory Network over Rhetorical Structure Theory for Sentence-level Sentiment Analysis. Proceedings of The 8th Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 63:17-32. Available from https://proceedings.mlr.press/v63/Fu62.html.
