Sponsored by the Center for Science and Technology Development of the Ministry of Education
Supervised by Ministry of Education of the People's Republic of China
1. Institute of Network Technology, Beijing University of Posts and Telecommunications, Beijing 100876
In this paper, we propose an attention-based structure for sentence encoding in the natural language inference (NLI) task. This attention structure performs soft alignment between word features and character features, encoding both into a joint feature vector space. It helps models exploit more of the information in unknown words and improves model performance in certain situations. Our experiments show that this structure works well on the Stanford Natural Language Inference (SNLI) dataset and the Multi-Genre Natural Language Inference (MultiNLI) corpus.
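The soft alignment between word and character features described above can be sketched as follows. This is a minimal illustration using dot-product attention and concatenation as the fusion step; the paper's actual scoring function, fusion method, and feature extractors are not specified here, so all names and shapes below are illustrative assumptions:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def soft_align(word_feats, char_feats):
    """Attend from each word vector over the character feature matrix,
    then concatenate the word vector with its aligned character summary
    to form a joint feature space (hypothetical fusion by concatenation)."""
    scores = word_feats @ char_feats.T            # (n_words, n_chars) alignment scores
    attn = softmax(scores, axis=-1)               # row-wise attention weights, each row sums to 1
    aligned = attn @ char_feats                   # (n_words, d) soft-aligned character features
    return np.concatenate([word_feats, aligned], axis=-1)  # (n_words, 2d) joint features

rng = np.random.default_rng(0)
word_feats = rng.normal(size=(5, 8))   # 5 words, 8-dim word embeddings (assumed)
char_feats = rng.normal(size=(12, 8))  # 12 characters, 8-dim character features (assumed)
joint = soft_align(word_feats, char_feats)
```

Because the aligned character summary is a convex combination of character features, an out-of-vocabulary word with a poor word embedding can still receive a meaningful representation through its attended character features.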
Keywords: natural language processing; natural language inference; deep learning; attention