A Joint Selective Mechanism for Abstractive Sentence Summarization
Proceedings of The 10th Asian Conference on Machine Learning, PMLR 95:756-769, 2018.
Abstract
The sequence-to-sequence (Seq2Seq) learning framework has been widely used in many natural language processing (NLP) tasks, including abstractive summarization and machine translation (MT). However, abstractive summarization generates its output in a lossy manner, whereas MT is almost lossless. We model this difference by introducing a joint selective mechanism: (i) a selective gate is added after the encoding phase of the Seq2Seq framework, which learns to tailor the original input information and produces a selected input representation; (ii) a selection loss function, computed by considering the input and the output jointly, is added to help the selective gate function well. Experimental results show that our proposed model outperforms most baseline models and is comparable to the state-of-the-art model in automatic evaluations.
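For intuition, a post-encoder selective gate can be sketched as below. This is a minimal PyTorch sketch assuming a SEASS-style parameterization (a sigmoid gate conditioned on each encoder state and a whole-sentence vector), since the abstract does not give the exact formulation; the names `SelectiveGate`, `w_h`, `w_s`, and `sent_repr` are illustrative, and the joint selection loss is omitted because its form is not specified here.

```python
import torch
import torch.nn as nn

class SelectiveGate(nn.Module):
    """Illustrative selective gate applied after encoding (assumed
    SEASS-style form; not necessarily the paper's exact parameterization)."""

    def __init__(self, hidden_size: int):
        super().__init__()
        self.w_h = nn.Linear(hidden_size, hidden_size, bias=False)  # per-token term
        self.w_s = nn.Linear(hidden_size, hidden_size)              # sentence-level term

    def forward(self, enc_states: torch.Tensor, sent_repr: torch.Tensor) -> torch.Tensor:
        # enc_states: (batch, src_len, hidden) encoder outputs
        # sent_repr:  (batch, hidden) summary vector of the whole source sentence
        gate = torch.sigmoid(self.w_h(enc_states) + self.w_s(sent_repr).unsqueeze(1))
        # Element-wise gating tailors the input into a "selected" representation.
        return enc_states * gate

# Usage sketch with random tensors:
gate = SelectiveGate(hidden_size=256)
enc = torch.randn(8, 30, 256)        # batch of 8, source length 30
s = torch.randn(8, 256)              # sentence representation
selected = gate(enc, s)              # (8, 30, 256), fed to the decoder/attention
```

The design point this illustrates is that the gate is computed once, after encoding, so the decoder attends over an already-filtered representation rather than the raw encoder states.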