Chinese Academy of Sciences Institutional Repositories Grid
Convolutional Multi-Head Self-Attention on Memory for Aspect Sentiment Classification

Document Type: Journal Article

Authors: Yaojie Zhang; Bing Xu; Tiejun Zhao
Journal: IEEE/CAA Journal of Automatica Sinica
Publication Date: 2020
Volume: 7    Issue: 4    Pages: 1038-1044
Keywords: Aspect sentiment classification; deep learning; memory network; sentiment analysis (SA)
ISSN: 2329-9266
DOI: 10.1109/JAS.2020.1003243
Abstract: This paper presents a method for aspect-based sentiment classification tasks, named the convolutional multi-head self-attention memory network (CMA-MemNet). It is an improved model based on memory networks, and it makes it possible to extract richer and more complex semantic information from sequences and aspects. To fix the memory network's inability to capture context-related information at the word level, we propose using convolution to capture n-gram grammatical information. We use multi-head self-attention to make up for the problem where the memory network ignores the semantic information of the sequence itself. Meanwhile, unlike most recurrent neural network (RNN), long short-term memory (LSTM), and gated recurrent unit (GRU) models, we retain the parallelism of the network. We experiment on the open datasets SemEval-2014 Task 4 and SemEval-2016 Task 6. Compared with some popular baseline methods, our model performs excellently.
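The abstract describes the architecture only at a high level. Below is a minimal, hypothetical PyTorch sketch of the idea it outlines (a 1-D convolution over a word-level memory to capture n-gram features, multi-head self-attention over the memory itself, and aspect-conditioned attention for classification). The class name, layer sizes, mean-pooled aspect query, and module composition are illustrative assumptions, not the authors' published implementation.

# Hypothetical sketch, not the authors' released code.
import torch
import torch.nn as nn

class CMAMemNetSketch(nn.Module):
    def __init__(self, embed_dim=300, n_heads=6, kernel_size=3, num_classes=3):
        super().__init__()
        # Convolution over the word-level memory captures n-gram context.
        self.conv = nn.Conv1d(embed_dim, embed_dim, kernel_size,
                              padding=kernel_size // 2)
        # Multi-head self-attention models dependencies within the sequence itself.
        self.self_attn = nn.MultiheadAttention(embed_dim, n_heads, batch_first=True)
        # Aspect-to-memory attention pools the memory for classification.
        self.cross_attn = nn.MultiheadAttention(embed_dim, n_heads, batch_first=True)
        self.classifier = nn.Linear(embed_dim, num_classes)

    def forward(self, context_emb, aspect_emb):
        # context_emb: (batch, seq_len, embed_dim); aspect_emb: (batch, asp_len, embed_dim)
        memory = self.conv(context_emb.transpose(1, 2)).transpose(1, 2)
        memory, _ = self.self_attn(memory, memory, memory)
        aspect_query = aspect_emb.mean(dim=1, keepdim=True)   # (batch, 1, embed_dim)
        pooled, _ = self.cross_attn(aspect_query, memory, memory)
        return self.classifier(pooled.squeeze(1))             # sentiment logits

Given pre-embedded context and aspect tensors, a call such as CMAMemNetSketch()(context_emb, aspect_emb) would return one logit per sentiment class for each sentence in the batch; unlike RNN/LSTM/GRU encoders, every layer here processes the whole sequence in parallel.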
Source URL: http://ir.ia.ac.cn/handle/173211/43011
Collection: Institute of Automation_Academic Journals_IEEE/CAA Journal of Automatica Sinica
Recommended Citation:
GB/T 7714
Yaojie Zhang, Bing Xu, Tiejun Zhao. Convolutional Multi-Head Self-Attention on Memory for Aspect Sentiment Classification[J]. IEEE/CAA Journal of Automatica Sinica, 2020, 7(4): 1038-1044.
APA
Yaojie Zhang, Bing Xu, & Tiejun Zhao. (2020). Convolutional Multi-Head Self-Attention on Memory for Aspect Sentiment Classification. IEEE/CAA Journal of Automatica Sinica, 7(4), 1038-1044.
MLA
Yaojie Zhang, et al. "Convolutional Multi-Head Self-Attention on Memory for Aspect Sentiment Classification." IEEE/CAA Journal of Automatica Sinica 7.4 (2020): 1038-1044.

Deposit Method: OAI Harvesting

Source: Institute of Automation


Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.