
 
 
A Dual-Attentive and Hybrid Word-Character Model for Chinese Short Text Summarization
Li Yufeng 1, Xu Weiran 2 *
1. School of Artificial Intelligence, Beijing University of Posts and Telecommunications, 100876
2. School of Artificial Intelligence, Beijing University of Posts and Telecommunications, 100876
*Corresponding author
Funding: none
Opened online: 31 December 2020
Accepted by: none
Citation: Li Yufeng, Xu Weiran. A Dual-Attentive and Hybrid Word-Character Model for Chinese Short Text Summarization [OL]. [31 December 2020] http://en.paper.edu.cn/en_releasepaper/content/4753279
 
 
Automatic text summarization is an important field in NLP, comprising extractive and abstractive methods. Among the world's languages, Chinese has special properties, such as rich character-level semantics and flexible abbreviation. Moreover, insufficient training samples are also a problem. In this paper, we propose a dual-attentive, hybrid word-character Chinese text summarization model. The hybrid word-character (HWC) approach preserves the advantages of both word-based and character-based representations, which are well suited to the Chinese language. The extractive and abstractive methods are combined to accurately capture the key information and distill the essence of articles with fewer supervised samples. We evaluate our model with the ROUGE metric on the widely used Chinese dataset LCSTS 2.0. The experimental results show that the model is effective.
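The hybrid word-character idea described above can be illustrated with a minimal sketch: a word's vector is formed by concatenating a word-level embedding with a summary of its character embeddings. The dimensions, lookup tables, and pooling choice below are invented for illustration and are not the authors' implementation.

```python
import numpy as np

# Toy embedding tables for one Chinese word and its characters.
# Real models would learn these; here they are random placeholders.
WORD_DIM, CHAR_DIM = 4, 3
rng = np.random.default_rng(0)

word_table = {"北京": rng.normal(size=WORD_DIM)}
char_table = {ch: rng.normal(size=CHAR_DIM) for ch in "北京"}

def hybrid_embedding(word):
    """Concatenate the word vector with the mean of its character vectors,
    so the representation keeps both word-level and character-level cues."""
    word_vec = word_table[word]
    char_vecs = np.stack([char_table[ch] for ch in word])
    return np.concatenate([word_vec, char_vecs.mean(axis=0)])

vec = hybrid_embedding("北京")  # shape: (WORD_DIM + CHAR_DIM,)
```

In practice the character-level summary is often produced by a CNN or RNN rather than mean pooling; the sketch only shows how the two granularities are combined into one vector.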
Keywords: natural language processing; Chinese text summarization; seq2seq; attention mechanism
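The ROUGE evaluation mentioned in the abstract can be sketched in simplified form. The function below computes a ROUGE-1 F1 score over character unigrams, a common convention for LCSTS; it is a stripped-down illustration, not the official ROUGE toolkit, and the example strings are invented.

```python
from collections import Counter

def rouge_1_f(candidate: str, reference: str) -> float:
    """ROUGE-1 F1: harmonic mean of unigram precision and recall.
    Chinese text is treated as a sequence of characters, so each
    character counts as one unigram."""
    cand, ref = Counter(candidate), Counter(reference)
    overlap = sum((cand & ref).values())  # clipped unigram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

score = rouge_1_f("今天天气好", "今天天气很好")  # ≈ 0.909
```

Published results additionally report ROUGE-2 (bigram overlap) and ROUGE-L (longest common subsequence), which follow the same precision/recall/F1 pattern.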
 
 
 
