MetaEmotionNet: Spatial-Spectral-Temporal-Based Attention 3-D Dense Network With Meta-Learning for EEG Emotion Recognition
Document type: Journal article
Authors | Ning, Xiaojun (6); Wang, Jing (6); Lin, Youfang (6); Cai, Xiyang (5); Chen, Haobin (6); Gou, Haijun (6); Li, Xiaoli (3,4); Jia, Ziyu (1,2) |
Journal | IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT |
Publication date | 2024 |
Volume | 73 |
Pages | 13 |
ISSN | 0018-9456 |
Keywords | Affective computing; attention mechanism; electroencephalogram (EEG); emotion recognition; meta-learning |
DOI | 10.1109/TIM.2023.3338676 |
Corresponding author | Jia, Ziyu (jia.ziyu@outlook.com) |
Abstract | Emotion recognition has become an important area in affective computing. Emotion recognition based on multichannel electroencephalogram (EEG) signals has gradually become popular in recent years. However, on one hand, how to make full use of different EEG features, and of the discriminative local patterns among those features, for various emotions is challenging. Existing methods ignore the complementarity among the spatial-spectral-temporal features and the discriminative local patterns in all features, which limits classification performance. On the other hand, when dealing with cross-subject emotion recognition, existing transfer learning (TL) methods need large amounts of training data. At the same time, collecting labeled EEG data is extremely expensive and time-consuming, which hinders the wide application of emotion recognition models to new subjects. To address these challenges, we propose a novel spatial-spectral-temporal-based attention 3-D dense network (SST-Net) with meta-learning, named MetaEmotionNet, for emotion recognition. Specifically, MetaEmotionNet integrates the spatial-spectral-temporal features simultaneously in a unified network framework through two-stream fusion. At the same time, the 3-D attention mechanism can adaptively explore discriminative local patterns. In addition, a meta-learning algorithm is applied to reduce dependence on training data. Experiments demonstrate that MetaEmotionNet is superior to the baseline models on two benchmark datasets. |
Funding projects | National Natural Science Foundation of China [62306317]; National Natural Science Foundation of China [61603029]; China Postdoctoral Science Foundation [2023M733738] |
WOS research areas | Engineering; Instruments & Instrumentation |
Language | English |
Publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC |
WOS accession number | WOS:001132683400228 |
Funding agencies | National Natural Science Foundation of China; China Postdoctoral Science Foundation |
Source URL | [http://ir.ia.ac.cn/handle/173211/55467] |
Collection | 脑图谱与类脑智能实验室 (Laboratory of Brain Atlas and Brain-Inspired Intelligence) |
Author affiliations |
1. Chinese Acad Sci, Inst Automat, Brainnetome Ctr, Beijing 100190, Peoples R China
2. Beijing Jiaotong Univ, Sch Comp & Informat Technol, Beijing 100044, Peoples R China
3. Nanyang Technol Univ, Sch Comp Sci & Engn, Singapore 308232, Singapore
4. ASTAR, Inst Infocomm Res, Singapore 138632, Singapore
5. Univ Calif Los Angeles, Samueli Sch Engn, Los Angeles, CA 90095 USA
6. Beijing Jiaotong Univ, Sch Comp & Informat Technol, Beijing Key Lab Traff Data Anal & Min, Beijing 100044, Peoples R China |
Recommended citation (GB/T 7714) | Ning, Xiaojun, Wang, Jing, Lin, Youfang, et al. MetaEmotionNet: Spatial-Spectral-Temporal-Based Attention 3-D Dense Network With Meta-Learning for EEG Emotion Recognition[J]. IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2024, 73: 13. |
APA | Ning, Xiaojun, Wang, Jing, Lin, Youfang, Cai, Xiyang, Chen, Haobin, ... & Jia, Ziyu. (2024). MetaEmotionNet: Spatial-Spectral-Temporal-Based Attention 3-D Dense Network With Meta-Learning for EEG Emotion Recognition. IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 73, 13. |
MLA | Ning, Xiaojun, et al. "MetaEmotionNet: Spatial-Spectral-Temporal-Based Attention 3-D Dense Network With Meta-Learning for EEG Emotion Recognition." IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT 73 (2024): 13. |
Ingestion method: OAI harvesting
Source: Institute of Automation