Chinese Academy of Sciences Institutional Repositories Grid
Multi-Level Attention-Based Categorical Emotion Recognition Using Modulation-Filtered Cochleagram

Document Type: Journal Article

Authors: Peng, Zhichao (1); He, Wenhua (1); Li, Yongwei (5); Du, Yegang (4); Dang, Jianwu (2,3)
Journal: APPLIED SCIENCES-BASEL
Publication Date: 2023-06-01
Volume: 13; Issue: 11; Pages: 16
Keywords: categorical emotion recognition; auditory signal processing; modulation-filtered cochleagram; multi-level attention
DOI: 10.3390/app13116749
Corresponding Authors: Peng, Zhichao (zcpeng@tju.edu.cn); Dang, Jianwu (jdang@jaist.ac.jp)
Abstract: Speech emotion recognition is a critical component for achieving natural human-robot interaction. The modulation-filtered cochleagram is a feature based on auditory modulation perception, which contains a multi-dimensional spectral-temporal modulation representation. In this study, we propose an emotion recognition framework that utilizes a multi-level attention network to extract high-level emotional feature representations from the modulation-filtered cochleagram. Our approach uses channel-level attention and spatial-level attention modules to generate emotional saliency maps of channel and spatial feature representations, capturing significant emotional channels and feature space from the 3D convolution feature maps, respectively. Furthermore, we employ a temporal-level attention module to capture significant emotional regions from the concatenated feature sequence of the emotional saliency maps. Our experiments on the Interactive Emotional Dyadic Motion Capture (IEMOCAP) dataset demonstrate that the modulation-filtered cochleagram significantly improves the prediction performance of categorical emotion compared to other evaluated features. Moreover, our emotion recognition framework achieves an unweighted accuracy of 71% in categorical emotion recognition, comparable to several existing approaches. In summary, our study demonstrates the effectiveness of the modulation-filtered cochleagram in speech emotion recognition, and our proposed multi-level attention framework provides a promising direction for future research in this field.
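As a rough illustration of the architecture described in the abstract, the following PyTorch sketch stacks channel-level, spatial-level, and temporal-level attention on top of a 3D convolution applied to a modulation-filtered cochleagram input. The module structure, tensor shapes, and hyperparameters below are illustrative assumptions only, not the authors' released implementation.

# Hypothetical sketch of the multi-level attention idea from the abstract.
# Shapes and hyperparameters are assumptions, not the paper's actual network.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel attention over 3D feature maps."""
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):              # x: (B, C, T, F, M)
        w = x.mean(dim=(2, 3, 4))      # global average pool -> (B, C)
        w = self.fc(w).view(x.size(0), -1, 1, 1, 1)
        return x * w                   # channel-weighted saliency map

class SpatialAttention(nn.Module):
    """Spatial saliency over the spectral-temporal-modulation axes."""
    def __init__(self, channels):
        super().__init__()
        self.conv = nn.Conv3d(channels, 1, kernel_size=1)

    def forward(self, x):              # x: (B, C, T, F, M)
        s = torch.sigmoid(self.conv(x))
        return x * s

class TemporalAttention(nn.Module):
    """Additive attention pooling over the frame sequence."""
    def __init__(self, dim):
        super().__init__()
        self.score = nn.Linear(dim, 1)

    def forward(self, seq):            # seq: (B, T, D)
        a = F.softmax(self.score(seq), dim=1)
        return (a * seq).sum(dim=1)    # (B, D)

class MultiLevelAttentionSER(nn.Module):
    """3D CNN + channel/spatial/temporal attention -> categorical emotion logits."""
    def __init__(self, in_channels=1, num_classes=4):
        super().__init__()
        self.conv = nn.Conv3d(in_channels, 32, kernel_size=3, padding=1)
        self.chan_att = ChannelAttention(32)
        self.spat_att = SpatialAttention(32)
        self.temp_att = TemporalAttention(32)
        self.cls = nn.Linear(32, num_classes)

    def forward(self, x):              # x: (B, 1, T, F, M) modulation-filtered cochleagram
        h = F.relu(self.conv(x))
        h = self.spat_att(self.chan_att(h))
        seq = h.mean(dim=(3, 4)).transpose(1, 2)   # pool F, M -> (B, T, C)
        return self.cls(self.temp_att(seq))

if __name__ == "__main__":
    # Toy input: batch of 2, 100 frames, 64 cochlear channels, 8 modulation bands.
    dummy = torch.randn(2, 1, 100, 64, 8)
    print(MultiLevelAttentionSER()(dummy).shape)   # torch.Size([2, 4])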
WOS Keywords: FEATURES
Funding Projects: Hunan Provincial Natural Science Foundation of China [2021JJ30379]; Youth Fund of the National Natural Science Foundation of China [62201571]
WOS Research Areas: Chemistry; Engineering; Materials Science; Physics
Language: English
WOS Record Number: WOS:001003438300001
Publisher: MDPI
Funding Organizations: Hunan Provincial Natural Science Foundation of China; Youth Fund of the National Natural Science Foundation of China
Source URL: [http://ir.ia.ac.cn/handle/173211/53498]
Collection: National Laboratory of Pattern Recognition_Intelligent Interaction
Corresponding Authors: Peng, Zhichao; Dang, Jianwu
Author Affiliations:
1.Hunan Univ Humanities Sci & Technol, Informat Sch, Loudi 417000, Peoples R China
2.Pengcheng Lab, Shenzhen 518055, Peoples R China
3.Tianjin Univ, Coll Intelligence & Comp, Tianjin 300350, Peoples R China
4.Waseda Univ, Future Robot Org, Tokyo 1698050, Japan
5.Chinese Acad Sci, Inst Automat, Natl Lab Pattern Recognit, Beijing 100045, Peoples R China
Recommended Citation:
GB/T 7714
Peng, Zhichao, He, Wenhua, Li, Yongwei, et al. Multi-Level Attention-Based Categorical Emotion Recognition Using Modulation-Filtered Cochleagram[J]. APPLIED SCIENCES-BASEL, 2023, 13(11): 16.
APA: Peng, Zhichao, He, Wenhua, Li, Yongwei, Du, Yegang, & Dang, Jianwu. (2023). Multi-Level Attention-Based Categorical Emotion Recognition Using Modulation-Filtered Cochleagram. APPLIED SCIENCES-BASEL, 13(11), 16.
MLA: Peng, Zhichao, et al. "Multi-Level Attention-Based Categorical Emotion Recognition Using Modulation-Filtered Cochleagram". APPLIED SCIENCES-BASEL 13.11 (2023): 16.

Deposit Method: OAI Harvesting

Source: Institute of Automation

