
 
 
Sentence Encoding: attention with another self
Zhang Xinnan 1, Li Wei 2*
1. Institute of Network Technology, Beijing University of Posts and Telecommunications, Beijing 100876
2. Institute of Network Technology, Beijing University of Posts and Telecommunications, Beijing 100876
*Corresponding author
Funding: none
Opened online: 24 January 2019
Accepted by: none
Citation: Zhang Xinnan, Li Wei. Sentence Encoding: attention with another self [OL]. [24 January 2019]. http://en.paper.edu.cn/en_releasepaper/content/4747072
 
 
In this paper, we propose an attention-based structure for sentence encoding in the natural language inference (NLI) task. The attention structure performs soft alignment between word features and character features, encoding both into a joint feature vector space. This helps the model exploit information from unknown words and improves performance in certain situations. Our experiments show that this structure performs well on the Stanford Natural Language Inference (SNLI) dataset and the Multi-Genre Natural Language Inference (MultiNLI) corpus.
Keywords: natural language processing; natural language inference; deep learning; attention
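The soft alignment described in the abstract can be sketched as a simple attention step: word features attend over character features, and the attention-weighted character summary is concatenated with each word vector to form the joint representation. The function name `soft_align`, the shared dimensionality, and the concatenation scheme below are illustrative assumptions, not the authors' actual implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def soft_align(word_feats, char_feats):
    """Soft-align word-level and character-level features via attention.

    word_feats: (n_words, d) word feature vectors
    char_feats: (n_chars, d) character-level feature vectors
                (same dimension d is an assumption for this sketch)
    Returns a (n_words, 2*d) joint representation: each word vector
    concatenated with its attention-weighted sum of character features.
    """
    scores = word_feats @ char_feats.T   # (n_words, n_chars) alignment scores
    attn = softmax(scores, axis=-1)      # attention weights over characters
    aligned = attn @ char_feats          # (n_words, d) aligned char summary
    return np.concatenate([word_feats, aligned], axis=-1)
```

In a real model the score function would typically be learned (e.g. a bilinear or feed-forward scorer) rather than a raw dot product, but the alignment-then-concatenate pattern is the same.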
 
 
 
