Chinese Academy of Sciences Institutional Repositories Grid
EEG-Based Evaluation of Aesthetic Experience Using BiLSTM Network

Document Type: Journal Article

Authors: Wang, Peishan (7,8); Feng, Haibei (5,6); Du, Xiaobing (5,6); Nie, Rui (4,8); Lin, Yudi (3,6); Ma, Cuixia (1,2,5); Zhang, Liang (7,8)
Journal: International Journal of Human-Computer Interaction
Publication Date: 2023
Pages: 14
Corresponding Author Email: zhangl@psych.ac.cn (Zhang, Liang)
Keywords: EEG; aesthetic experience; deep learning; physical product evaluation
ISSN: 1044-7318
DOI: 10.1080/10447318.2023.2278926
Corresponding Author: Zhang, Liang (zhangl@psych.ac.cn)
Document Subtype: Review
English Abstract:

The evaluation of aesthetic design plays a pivotal role in product development, which calls for an effective, objective method to measure customers' experience. The stability and effectiveness of electroencephalography (EEG) make it a suitable tool for measuring aesthetic experience. Nevertheless, existing studies have several limitations, particularly regarding the stimuli and the algorithms used. The potential of EEG-based deep learning models to pinpoint subtle differences in physical product aesthetics has not been verified. To fill this research gap, we recorded EEG signals in real-life scenarios while participants were presented with different types of physical smartphones and asked them to rate the phones on four dimensions of aesthetic experience (arousal, valence, likeness, and aesthetic evaluation). The time-frequency data were then fed into a spatial feature extraction network and an attention-based bidirectional long short-term memory (BiLSTM) network optimized with a cross-entropy loss function. The results showed that, at a 16 s window size, the four outcome models yielded the best joint recognition performance of aesthetic experience, with an average accuracy of over 85% (arousal: 88.10%, valence: 87.97%, likeness: 85.99%, and aesthetic evaluation: 87.23%). This provides an objective cross-subject recognition method with multi-faceted evaluation results of aesthetic experience. Additionally, we verified that EEG is a reliable and informative resource for evaluating aesthetic experience, even when differences are subtle. More practically, incorporating EEG signals into subjective product aesthetics measurement deserves greater consideration as a future direction.
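
For illustration, the following is a minimal sketch, assuming PyTorch, of the kind of attention-based BiLSTM classifier the abstract describes. The input shape, layer sizes, the simple linear "spatial" stage, and the additive attention layer are assumptions for the sketch, not the authors' actual implementation.

    # Hypothetical sketch of an attention-based BiLSTM classifier for EEG
    # time-frequency features; shapes and layer sizes are illustrative only.
    import torch
    import torch.nn as nn

    class AttentionBiLSTM(nn.Module):
        def __init__(self, n_features=128, hidden_size=64, n_classes=2):
            super().__init__()
            # Placeholder "spatial feature extraction" stage: a per-timestep
            # projection of the channel/frequency features.
            self.spatial = nn.Sequential(
                nn.Linear(n_features, n_features),
                nn.ReLU(),
            )
            # Bidirectional LSTM over the time dimension.
            self.bilstm = nn.LSTM(
                input_size=n_features,
                hidden_size=hidden_size,
                batch_first=True,
                bidirectional=True,
            )
            # Additive attention over the BiLSTM outputs.
            self.attn = nn.Linear(2 * hidden_size, 1)
            self.classifier = nn.Linear(2 * hidden_size, n_classes)

        def forward(self, x):
            # x: (batch, time, features), e.g. one 16 s window of
            # time-frequency features per trial.
            h = self.spatial(x)
            out, _ = self.bilstm(h)                      # (batch, time, 2*hidden)
            weights = torch.softmax(self.attn(out), 1)   # attention over time
            context = (weights * out).sum(dim=1)         # (batch, 2*hidden)
            return self.classifier(context)              # class logits

    # One such model per aesthetic dimension (arousal, valence, likeness,
    # aesthetic evaluation), each trained with cross-entropy loss.
    model = AttentionBiLSTM()
    criterion = nn.CrossEntropyLoss()
    logits = model(torch.randn(8, 32, 128))              # dummy batch of 8 windows
    loss = criterion(logits, torch.randint(0, 2, (8,)))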

Indexed By: SCI; SSCI
WOS Keywords: VISUAL AESTHETICS; EMOTION RECOGNITION; PRODUCT; PREFERENCE
Funding Project: National Natural Science Foundation of China
WOS Research Area: Computer Science; Engineering
Language: English
WOS Record Number: WOS:001122516900001
Publisher: TAYLOR & FRANCIS INC
Funding Organization: National Natural Science Foundation of China
Source URL: http://ir.psych.ac.cn/handle/311026/46594
Collection: Institute of Psychology, CAS Key Laboratory of Behavioral Science
Affiliations:
1. International Joint Laboratory of Artificial Intelligence and Emotional Interaction, Beijing Key Laboratory of Human-Computer Interactions, Beijing, China
2. State Key Laboratory of Computer Science, Institute of Software, Chinese Academy of Sciences, Beijing, China
3. Department of Computer Science, University of Southern California, Los Angeles, CA, United States
4. Department of Biostatistics, University of Michigan Ann Arbor, Ann Arbor, MI, United States
5. Department of Computer Science and Technology, University of Chinese Academy of Sciences, Beijing, China
6. Beijing Key Laboratory of Human-Computer Interactions, Institute of Software, Chinese Academy of Sciences, Beijing, China
7. Department of Psychology, University of Chinese Academy of Sciences, Beijing, China
8. Key Laboratory of Behavioral Science, Institute of Psychology, Chinese Academy of Sciences, Beijing, China
Recommended Citation:
GB/T 7714
Wang, Peishan, Feng, Haibei, Du, Xiaobing, et al. EEG-Based Evaluation of Aesthetic Experience Using BiLSTM Network[J]. International Journal of Human-Computer Interaction, 2023: 14.
APA Wang, Peishan., Feng, Haibei., Du, Xiaobing., Nie, Rui., Lin, Yudi., ... & Zhang, Liang. (2023). EEG-Based Evaluation of Aesthetic Experience Using BiLSTM Network. International Journal of Human-Computer Interaction, 14.
MLA Wang, Peishan, et al. "EEG-Based Evaluation of Aesthetic Experience Using BiLSTM Network". International Journal of Human-Computer Interaction (2023): 14.

Ingest Method: OAI harvesting

Source: Institute of Psychology


Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.