Chinese Academy of Sciences Institutional Repositories Grid
心脑交互视角下基于心跳间期识别情绪 (Recognizing Emotions Based on Heartbeat Intervals from the Perspective of Heart-Brain Interaction)

Document Type: Thesis

Author: 黄鑫
Date of Defense: 2024-06
Subtype: Doctoral dissertation
Degree-Granting Institution: University of Chinese Academy of Sciences
Place of Degree Conferral: Institute of Psychology, Chinese Academy of Sciences
Other Contributor: 刘正奎
Keywords: heartbeat interval; emotion recognition; machine learning; heart-brain interaction; emotion theory
Degree Name: Doctor of Science
Degree Discipline: Applied Psychology
Alternative Title: Recognizing emotions based on heartbeat intervals from the perspective of heart-brain interaction
English Abstract: In recent years, emotion recognition has emerged as a research hotspot in psychology and neuroscience. Cardiac signals, as a pivotal physiological measure, have been extensively employed in emotion recognition research (in this research, the dynamical features of heartbeat intervals are referred to as cardiac dynamics features). Models that recognize emotions from cardiac dynamics features generally exhibit the following shortcomings: 1) a lack of systematic evaluation of feature effectiveness; 2) an absence of ecological-validity assessment of the models in real-world settings; 3) insufficient understanding of how these features relate to brain activity during emotional processes; 4) a lack of tests of theories of emotion composition. To address these issues, the present research conducted three studies to examine the effectiveness of cardiac dynamics features for emotion recognition, the applicability of the models under real-world conditions, and the correlation between cardiac dynamics features and brain activity from a heart-brain interaction perspective, and to test the dimensional theory of emotion.

In Study 1, 64 university students were recruited and ECG data were collected under different emotional valence and arousal conditions. Emotion recognition models were built with traditional HRV metrics and with features derived from the emerging sympathetic nervous index (SAI) and parasympathetic nervous index (PAI), achieving accuracies of 82.00% and 76.21% in four-class valence models and 93.33% and 97.50% in two-class arousal models. Features such as sai_nni_20, pai_nni_50, pai_nni_20, HRV_CMSEn, sai_acf_lhalf, and HRV_SI were identified as the most important.

In Study 2a, 47 university students were recruited to compare the predictive performance of cardiac dynamics features derived from photoplethysmography (PPG) data with that of features derived from electrocardiography (ECG) data. Models based on PPG data achieved maximum accuracies of 78.65% for four-class valence and 88.40% for two-class arousal; although lower than the ECG-based models, their predictive performance was still relatively good and suitable for real-world scenarios. Among the algorithms, deep learning models demonstrated superior predictive ability.

Study 2b was conducted in a real-world setting with 2,687 university students, using wrist-worn wearable devices to collect PPG signals. Models built on cardiac dynamics features achieved highest accuracies of 87.85% for three-class valence and 98.01% for two-class arousal; the greatly increased volume of real-world data markedly enhanced predictive power, and deep learning models again showed a clear advantage.

In Study 3a, based on data from 45 and 41 university students respectively, ECG and EEG data were collected simultaneously and the RSA method was used to examine the brain regions associated with cardiac dynamics features under different valence and arousal conditions. Three typical feature-related brain regions were identified for different valence levels and two for different arousal levels. The complementarity and clustering among the feature-related brain regions reflect the integrative role of the whole brain in emotional activity, and such clustering may aid model improvement in the future.
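A minimal Python sketch of the kind of pipeline outlined above: computing a few generic time-domain descriptors of interbeat intervals and cross-validating a classifier. This is not the thesis code; the data, labels, feature set, and function names are hypothetical, and the actual studies additionally used SAI/PAI-derived and nonlinear HRV features (e.g., HRV_CMSEn) as well as deep learning models.

```python
# Hypothetical sketch: time-domain IBI features + classifier cross-validation.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def ibi_features(ibi_ms: np.ndarray) -> np.ndarray:
    """Basic time-domain descriptors of one segment of inter-beat intervals (ms)."""
    diffs = np.diff(ibi_ms)
    return np.array([
        ibi_ms.mean(),                      # mean NN interval
        ibi_ms.std(ddof=1),                 # SDNN
        np.sqrt(np.mean(diffs ** 2)),       # RMSSD
        np.mean(np.abs(diffs) > 20) * 100,  # pNN20 (%)
        np.mean(np.abs(diffs) > 50) * 100,  # pNN50 (%)
    ])

rng = np.random.default_rng(0)
# Placeholder dataset: 200 segments of ~120 beats each with binary arousal labels.
segments = [rng.normal(800, 50, size=120) for _ in range(200)]
X = np.vstack([ibi_features(s) for s in segments])
y = rng.integers(0, 2, size=200)            # placeholder high/low arousal labels

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print(cross_val_score(clf, X, y, cv=5).mean())  # near chance here; real data needed
```

With real, labeled IBI segments in place of the placeholder data, the same interface extends naturally to the multi-class valence models described above.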
In Study 3b, with the same participants as Study 3a, the heartbeat-evoked potential (HEP) index was used to examine the "valence-arousal" model of emotion from a heart-brain interaction perspective. The results showed that: 1) different emotional valences produced significant HEP differences over the frontal lobe, occipital lobe, and other left-hemisphere regions, at a latency of around 300 ms in the HEP; however, in HEP amplitude, positive and neutral emotions were similar and both higher than negative emotions, suggesting that emotional valence did not behave as a single dimension and that further subdivision into sub-dimensions, or modification of the model's morphological features, may be needed; 2) for arousal, no significant HEP difference was found between negative and stress emotions (same valence, but higher arousal for stress), whereas stress emotions differed significantly from neutral and positive emotions in the right frontotemporal HEP within an 8-25 ms latency window. This suggests that comparisons of arousal may need to be grounded in the functionality of emotions (negative and stress emotions share valence but differ in functionality), for example by distinguishing fight, flight, and affiliation orientations.

Through the above studies, we explored a path from the laboratory to the real world for emotion recognition based on interbeat intervals. The laboratory models showed high predictive ability, and although data-collection tools in real-world settings are somewhat less precise, the sharply increased data volume allowed the models to retain substantial predictive power, indicating preliminary applied value for predicting emotions in the real world. The studies examined the importance of different features, with SAI- and PAI-related indices proving more important. By investigating the correlation between cardiac dynamics features and brain activity during emotional processes, the research enriches the understanding of the physiological basis of emotion, improves model interpretability, and provides potential physiological grounds for model improvement. Finally, from a heart-brain interaction perspective on theories of emotion composition, two directions for improvement were proposed: reshaping the morphological features of the "valence-arousal" model and classifying emotions by different functional orientations.
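As a schematic illustration (not the thesis pipeline) of how a heartbeat-evoked potential can be derived before statistical comparison across emotion conditions, the sketch below averages EEG segments time-locked to detected R-peaks with baseline correction; the sampling rate, channel count, epoch window, and function name are assumptions for the example.

```python
# Hypothetical sketch: average EEG epochs time-locked to R-peaks to obtain the HEP.
import numpy as np

def heartbeat_evoked_potential(eeg: np.ndarray, r_peaks: np.ndarray,
                               sfreq: float, tmin: float = -0.1,
                               tmax: float = 0.6) -> np.ndarray:
    """Average EEG (channels x samples) around R-peak sample indices."""
    pre, post = int(round(-tmin * sfreq)), int(round(tmax * sfreq))
    epochs = []
    for peak in r_peaks:
        if peak - pre < 0 or peak + post > eeg.shape[1]:
            continue                        # skip R-peaks too close to the edges
        segment = eeg[:, peak - pre: peak + post].astype(float)
        segment -= segment[:, :pre].mean(axis=1, keepdims=True)  # baseline correction
        epochs.append(segment)
    return np.mean(epochs, axis=0)          # channels x samples: the HEP

# Placeholder data: 32-channel EEG at 250 Hz with R-peaks roughly every 800 ms.
rng = np.random.default_rng(0)
sfreq, eeg = 250.0, rng.normal(size=(32, 250 * 60))
r_peaks = np.arange(200, eeg.shape[1] - 200, 200)
hep = heartbeat_evoked_potential(eeg, r_peaks, sfreq)
print(hep.shape)  # (32, 175): a 0.7 s window at 250 Hz
```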
Chinese Abstract: 近年来,情绪识别成为心理学和神经科学领域的研究热点之一。其中,心跳信号作为一种重要的生理指标,被广泛应用于情绪识别的研究中(心跳间期的动力学特征在本研究中称为心跳动力学特征)。基于心跳动力学特征识别情绪的模型总体存在以下不足:1、缺乏对于特征有效性的系统检验;2、缺乏在真实世界中模型生态效度的考察;3、缺乏特征在情绪活动中与大脑活动关联性的理解;4、缺乏对情绪构成理论的检验。针对上述问题,本研究开展 3 个研究,分别考察心跳动力学特征用于情绪识别的有效性,在真实世界条件下检验模型的适用性,以心脑交互角度考察心跳动力学特征与大脑活动的相关性,检验情绪维度理论。

研究 1 招募 64 名大学生被试,在不同情绪效价、唤醒度条件下采集 ECG 数据,使用传统 HRV 指标与新兴的交感神经指数(SAI)、副交感神经指数(PAI)相关特征建立情绪识别模型,在情绪效价四分类模型获得 82.00%、76.21%的准确率,在情绪唤醒度二分类获得 93.33%、97.50%的准确率。发现 sai_nni_20、pai_nni_50、pai_nni_20、HRV_CMSEn、sai_acf_lhalf、HRV_SI 是更具重要性的特征。

研究 2a 招募 47 名大学生被试,比较基于 Photoplethysmography (PPG) 数据与基于 Electrocardiography (ECG) 数据的心跳动力学特征预测情绪的效果差别。基于 PPG 数据的模型在预测情绪效价四分类、唤醒度二分类最高准确率 78.65%、88.40%,低于基于 ECG 的数据,但总体预测效果尚佳,适用于真实世界应用。在算法方面,深度学习模型展现出更好的预测能力。

研究 2b 在真实世界条件下招募 2687 名大学生,使用可穿戴腕部设备采集 PPG 信号,建立基于心跳动力学特征预测情绪的模型,对于情绪效价三分类最高准确率 87.85%,情绪唤醒度二分类最高准确率 98.01%,由于真实世界数据量的增大,基于 PPG 信号的模型预测能力显著提升。深度学习模型展现出显著优势。

研究 3a 分别基于 45 名、41 名大学生数据,使用 RSA 方法,同时采集 ECG 与 EEG 数据,考察在不同情绪效价、唤醒度条件下心跳动力学特征相关的脑区,发现 3 种在不同情绪效价下典型的相关脑区,2 种在不同情绪唤醒度下典型的相关脑区。特征的相关脑区之间存在互补性和组群化特点,体现了全脑在情绪活动中的综合作用,组群化未来可能有助于模型改善。

研究 3b 被试同研究 3a,使用 HEP 指标,以心脑互动角度检验了情绪“效价-唤醒度”模型,结果发现 1、不同情绪效价在额叶、枕叶和其他左半脑的 HEP 指标产生显著差异,时间为 HEP 后 300ms 前后,但 HEP 的振幅上,正性与中性情绪相似,高于负性情绪,情绪效价并未展现出单维度特性,未来可以考虑进一步划分子维度或者改变模型的形态学特征;2、情绪唤醒度方面,负性情绪与应激情绪(效价相同,应激唤醒度高于负性)无显著 HEP 差异,而应激与中性、正性情绪在右侧额颞叶 HEP 指标产生显著差异,时间窗口为 HEP 后 8-25ms,这一结果说明唤醒度的比较可能需要基于情绪的功能性(负性与应激情绪虽然效价一致但有不同的功能性),比如划分为战斗、退缩、亲和等不同取向。

通过上述 3 个研究,探索了使用心跳间期进行情绪识别的实验室到真实世界的路径,实验室模型展现出较高的预测能力,在真实世界中采集工具精度虽然略低,但由于数据量的剧增,模型仍展现了较高的预测能力,初步展现在真实世界预测情绪的应用性价值。研究中探索了不同特征的重要性,SAI、PAI 指标展现出更高的重要性。研究中还探索了情绪活动中心跳动力学特征与脑的相关性,丰富了对于情绪生理基础的理解,增强了模型的可解释性,并为模型的提升提供了潜在的生理依据。最后研究根据心脑交互的视角对情绪构成理论提出了两个改进方向:改造情绪“效价-唤醒度”理论的形态学特征、依据不同的功能取向划分情绪类别。
Language: Chinese
Source URL: http://ir.psych.ac.cn/handle/311026/47981
Collection: Institute of Psychology, Applied Research Section
Recommended Citation (GB/T 7714):
黄鑫. 心脑交互视角下基于心跳间期识别情绪[D]. 中国科学院心理研究所. 中国科学院大学. 2024.

Deposit Method: OAI harvesting

Source: Institute of Psychology


Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.