中国科学院机构知识库网格
Chinese Academy of Sciences Institutional Repositories Grid
Dissociable Modulation of Overt Visual Attention by Affective Dimensions and Bayesian Integration of Ocular Position–Vestibular Signals in Self-Motion Heading Perception

Document type: Thesis (Master's)

Author: 倪剑光
Degree type: Master's
Defense date: 2011-05
Degree-granting institution: Graduate University of Chinese Academy of Sciences (中国科学院研究生院)
Degree-granting place: Beijing
Supervisor: 胡新天
Keywords: self-motion; eye position; vestibular; arousal; heading perception; Bayesian model
Alternative title: Dissociable Modulation of Overt Visual Attention by Affective Dimensions & Bayesian Integration of Ocular Position – Vestibular Signals in Self-motion Heading Perception
Degree discipline: Neurobiology
Abstract (Chinese): Part I: Emotional stimuli bear on an organism's survival and reproduction and thus have important evolutionary significance; consequently, such stimuli readily capture attention and receive preferential processing in the brain. The two principal dimensions of affective space, valence (the pleasantness of a stimulus) and arousal (the intensity of the emotion it evokes), have been reported to have dissociable neural substrates in the human olfactory, gustatory, and memory systems. However, whether these two dimensions modulate overt visual attention dissociably remains unresolved. In this study we examined the possible mechanisms by which the valence and arousal of emotional stimuli modulate overt visual attention. Twenty-one healthy participants freely viewed emotional pictures selected from the International Affective Picture System (IAPS) while their eye movements were monitored with an eye-tracking system. The pictures were divided by affective rating into two stimulus blocks: a valence block and an arousal block. We introduced methods from algebraic graph theory, modeling saccadic scan paths as weighted undirected path graphs to examine their global topological structure. The results show that scan patterns differed significantly across pictures with different affective ratings: pictures in the valence block elicited faster saccades and a broader distribution of attention, whereas pictures in the arousal block elicited more local saccades and a denser allocation of attention. Our model further revealed that the modulation of overt visual attention by valence is linear, that is, its effect varies linearly with valence level, whereas the modulation by arousal is nonlinear, following a quadratic function of arousal level. These findings indicate that different dimensions of affective space modulate overt visual attention through markedly different mechanisms, suggesting dissociable neural substrates and distinct computational rules for valence and arousal in the visual system.

Part II: Humans and animals cannot move freely through the environment without heading perception, the sense of the direction of self-motion. Numerous studies have shown that heading plays an important role in coordinating and planning movement; accurate heading estimation requires the brain to integrate multimodal, multichannel sensory information, most importantly visual signals, vestibular signals (rotation, linear acceleration, and the direction of gravity), and somatic proprioceptive signals, and may also involve eye-position signals. Eye-position signals, determined by the efference copy of oculomotor commands and by ocular proprioception, play important roles in the spatial localization of visual targets, the direction of binocular vision, and visual stability. This study examined how, in the absence of visual input, eye-position signals and vestibular otolith signals influence heading estimation during linear translational self-motion, and what computational rules the brain may adopt. We used a two-alternative forced-choice (2AFC) paradigm as the behavioral task. Participants (N = 19) sat upright in a motion simulator in darkness and were passively translated forward, with the direction of motion deviating from straight ahead by a randomly chosen two or five degrees. Two LEDs (about 2 mm in diameter) served as visual fixation points and were mounted on the motion platform at equal distances on either side of straight ahead; participants fixated the illuminated LED during the motion, deviating the eyes about 16 degrees from the center of the orbit. After the motion stopped, participants pressed a key as quickly as possible to report the direction of motion (left or right of straight ahead). The 2AFC results show that reported heading shifted significantly with eye position, in the same direction as the eyes' deviation from the center of the orbit, with an average gain of about 0.2. We propose two theoretical models to explain how the interaction of eye-position and vestibular signals contributes to the brain's heading estimate. On this basis we developed a Bayesian mathematical model to assess the possible relative contributions of eye-position and vestibular signals to heading perception; the model's eye-position contribution parameter accurately predicted the magnitude of the heading shift. This study demonstrates the important role of eye-position information in heading estimation and provides new evidence for its role in sensorimotor integration.
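The abstracts describe the scan-path analysis only at a high level. A minimal sketch of the general idea, assuming fixations are linked in temporal order and edges are weighted by inverse saccade amplitude (the thesis may use a different graph construction or weighting), is given below.

```python
# Minimal sketch (not the author's code): model a scan path as a weighted
# undirected path graph and compute its algebraic connectivity (Fiedler value),
# one example of a global topology metric for a viewing trial.
# Assumption: edge weight = inverse of saccade amplitude between fixations.
import numpy as np

def algebraic_connectivity(fixations):
    """fixations: (n, 2) array of fixation coordinates in temporal order."""
    fixations = np.asarray(fixations, dtype=float)
    n = len(fixations)
    if n < 2:
        raise ValueError("need at least two fixations")
    # Weighted adjacency of the path graph: node i <-> node i+1.
    W = np.zeros((n, n))
    for i in range(n - 1):
        dist = np.linalg.norm(fixations[i + 1] - fixations[i])
        w = 1.0 / (dist + 1e-9)            # shorter saccade -> stronger edge (assumed weighting)
        W[i, i + 1] = W[i + 1, i] = w
    # Graph Laplacian L = D - W; its second-smallest eigenvalue is the
    # algebraic connectivity of the graph.
    L = np.diag(W.sum(axis=1)) - W
    eigvals = np.linalg.eigvalsh(L)
    return eigvals[1]

# Example: a tightly clustered scan path vs. a widely spread one.
local_path  = [(0, 0), (5, 3), (2, 6), (7, 1)]
spread_path = [(0, 0), (200, 150), (50, 300), (400, 80)]
print(algebraic_connectivity(local_path), algebraic_connectivity(spread_path))
```

Under this weighting, a locally clustered scan path (short saccades, dense attention allocation) yields larger edge weights and a larger Fiedler value than a widely spread one, giving a single scalar that can be compared across affective conditions.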
Abstract (English): Part I: Emotional stimuli have evolutionary significance for the survival and reproduction of organisms; therefore, they are attention-grabbing and are processed preferentially. The neural underpinnings of the two principal emotional dimensions in affective space, valence (degree of pleasantness) and arousal (intensity of evoked emotion), have been shown to be dissociable in the olfactory, gustatory and memory systems. However, the separable roles of valence and arousal in scene perception are poorly understood. In this study, we asked how these two emotional dimensions modulate overt visual attention. Twenty-two healthy volunteers freely viewed images from the International Affective Picture System (IAPS) that were graded for affective levels of valence and arousal (high, medium, and low). Subjects' heads were immobilized and eye movements were recorded by camera to track overt shifts of visual attention. Algebraic graph-based approaches were introduced to model scan paths as weighted undirected path graphs, generating global topology metrics that characterize the algebraic connectivity of scan paths. Our data suggest that human subjects show different scanning patterns to stimuli with different affective ratings. Valence-salient stimuli (with neutral arousal) elicited faster and larger shifts of attention, while arousal-salient stimuli (with neutral valence) elicited local scanning, dense attention allocation and deep processing. Furthermore, our model revealed that the modulatory effect of valence was linearly related to the valence level, whereas the relation between the modulatory effect and the level of arousal was nonlinear. Hence, visual attention seems to be modulated by mechanisms that are separate for valence and arousal.

Part II: Humans and animals navigate freely in the environment guided by the perceived direction of self-motion. Heading, the perception of motion direction, plays a critical role in movement planning and coordination and is computed by the brain through integration of multimodal sensory cues, including retinal visual input, vestibular signals for rotational, translational and gravitational acceleration, somatosensory proprioceptive signals, and ocular-position information. Ocular-position information, determined by the efference copy of oculomotor commands and ocular-motor proprioception, has been reported to play influential roles in visual object localization, visual direction, and visual stability. This study investigated the roles of ocular-position signals and otolith signals in heading perception during passive translational self-motion without visual input, and the computational rules the brain may adopt. We employed a two-alternative forced-choice (2AFC) paradigm as the behavioral task. Participants (N = 19) were passively translated in a motion simulator in darkness, in randomized directions (2°, 5°, −2°, −5°) with respect to straight ahead. Two LEDs (diameter ≈ 2 mm), used as fixation cues, were anchored to the motion simulator at equal distances on either side of the straight-ahead direction. During the motion, participants had to fixate on the LED, deviating eye position by approximately 16 degrees. They reported their perceived heading with respect to straight ahead as soon as possible after the motion terminated. The 2AFC results showed that heading perception was shifted toward the direction of the eccentric eye position, with a gain of around 0.2. We proposed two theoretical models to elucidate the interaction of ocular-position and vestibular signals underlying the heading shifts. A Bayesian mathematical model was then developed to separate their individual contributions to heading estimation; it accurately predicted the heading shifts through an ocular-position contribution coefficient.
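The record does not reproduce the model equations. One simple way to formalize an eye-position contribution coefficient w and the 2AFC behavior it predicts is sketched below; this is an illustrative linear-combination sketch, not necessarily the thesis's Bayesian formulation, and all parameter values are assumptions.

```python
# Minimal sketch (not the author's model): a fraction w of the heading estimate
# is referenced to gaze direction rather than the head, shifting perceived
# heading by w * eye_position.  sigma, n_trials and the seed are illustrative.
import numpy as np

def perceived_heading(theta_deg, eye_deg, w=0.2):
    """Perceived heading (deg) given true heading and eccentric eye position."""
    return theta_deg + w * eye_deg

def simulate_2afc(theta_deg, eye_deg, w=0.2, sigma=2.0, n_trials=200, seed=0):
    """Fraction of 'rightward' reports for one heading / eye-position condition,
    assuming Gaussian noise (sigma, deg) on the perceived heading."""
    rng = np.random.default_rng(seed)
    percept = perceived_heading(theta_deg, eye_deg, w) + sigma * rng.standard_normal(n_trials)
    return float(np.mean(percept > 0.0))

# Headings used in the experiment (degrees from straight ahead).
headings = np.array([-5.0, -2.0, 2.0, 5.0])
for eye in (-16.0, +16.0):                     # eccentric fixation left / right
    p_right = [simulate_2afc(th, eye) for th in headings]
    print(f"eye = {eye:+.0f} deg, P(report right) = {np.round(p_right, 2)}")
```

With w = 0.2 and a 16° eccentric fixation, perceived heading shifts by 0.2 × 16° = 3.2° toward the fixation side, i.e. the psychometric curves for the two eye positions separate by a shift whose gain relative to eye eccentricity is about 0.2, matching the value reported in the abstract.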
Language: Chinese
Date available: 2013-04-23
Source URL: [http://159.226.149.42:8088/handle/152453/7376]
Collection: 昆明动物研究所_神经系统编码 (Kunming Institute of Zoology, Neural Coding)
Recommended citation
GB/T 7714
倪剑光. 情绪维度对外显视觉注意的分离调控及自我运动感觉中眼球位置-前庭信号的贝叶斯整合[D]. 北京. 中国科学院研究生院. 2011.

Deposit method: OAI harvesting

Source: Kunming Institute of Zoology (昆明动物研究所)

