Chinese Academy of Sciences Institutional Repositories Grid
OS-SSVEP: One-shot SSVEP classification

Document type: Journal article

Authors: Deng, Yang1,2; Ji, Zhiwei3; Wang, Yijun4; Zhou, S. Kevin1,2,5,6
Journal: NEURAL NETWORKS
Publication date: 2024-12-01
Volume: 180; Pages: 12
Keywords: Brain-computer interface (BCI); Steady-state visual evoked potential (SSVEP); One-shot classification; Transfer learning; Data augmentation
ISSN: 0893-6080
DOI: 10.1016/j.neunet.2024.106734
Abstract: It is extremely challenging to classify steady-state visual evoked potentials (SSVEPs) in scenarios characterized by a severe scarcity of calibration data, where only one calibration trial is available for each stimulus target. To address this challenge, we introduce a novel approach named OS-SSVEP, which combines a dual-domain cross-subject fusion network (CSDuDoFN) with task-related and task-discriminant component analysis (TRCA and TDCA) based on data augmentation. The CSDuDoFN framework is designed to comprehensively transfer information from source subjects, while TRCA and TDCA are employed to exploit the information from the single available calibration trial of the target subject. Specifically, CSDuDoFN uses multi-reference least-squares transformation (MLST) to map data from both the source subjects and the target subject into the domain of sine-cosine templates, thereby reducing the cross-subject domain gap and benefiting transfer learning. In addition, CSDuDoFN is fed with both transformed and original data, with an adequate fusion of their features occurring at different network layers. To capitalize on the calibration trial of the target subject, OS-SSVEP utilizes source aliasing matrix estimation (SAME)-based data augmentation to incorporate the augmented data into the training process of the ensemble TRCA (eTRCA) and TDCA models. Ultimately, the outputs of CSDuDoFN, eTRCA, and TDCA are combined for the final SSVEP classification. The effectiveness of our proposed approach is comprehensively evaluated on three publicly available SSVEP datasets, achieving the best performance on two datasets and competitive performance on the third. Further, it is worth noting that our method follows a different technical route from the current state-of-the-art (SOTA) method, and the two are complementary: performance is significantly improved when our method is combined with the SOTA method. This study underscores the potential to integrate SSVEP-based brain-computer interfaces (BCIs) into daily life. The corresponding source code is accessible at https://github.com/Sungden/One-shot-SSVEP-classification.
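The MLST step described in the abstract maps EEG from source and target subjects into the domain of sine-cosine reference templates. Below is a minimal NumPy sketch of that general idea only, not the authors' implementation (see the linked repository for the actual code): it builds a sine-cosine template for one stimulus frequency and fits a per-trial least-squares transform onto it. The function names, sampling rate, channel count, and the random stand-in trial are illustrative assumptions.

```python
# Minimal sketch (not the authors' code) of mapping an EEG trial onto
# sine-cosine templates via least squares, the general idea behind MLST.
import numpy as np

def sine_cosine_template(freq, n_harmonics, fs, n_samples):
    """Build a (2*n_harmonics, n_samples) sine-cosine reference for one stimulus."""
    t = np.arange(n_samples) / fs
    rows = []
    for h in range(1, n_harmonics + 1):
        rows.append(np.sin(2 * np.pi * h * freq * t))
        rows.append(np.cos(2 * np.pi * h * freq * t))
    return np.stack(rows)

def least_squares_projection(eeg_trial, template):
    """Fit a transform M so that M @ eeg_trial approximates the template,
    and return the transformed trial.

    eeg_trial: (n_channels, n_samples); template: (2*n_harmonics, n_samples).
    """
    # lstsq solves eeg_trial.T @ M.T ~= template.T for M.T.
    Mt, *_ = np.linalg.lstsq(eeg_trial.T, template.T, rcond=None)
    return Mt.T @ eeg_trial  # trial mapped into the sine-cosine template domain

# Toy usage with random data standing in for a single calibration trial.
fs, n_samples, n_channels = 250, 250, 9          # 1 s of 9-channel EEG at 250 Hz (assumed)
rng = np.random.default_rng(0)
trial = rng.standard_normal((n_channels, n_samples))
ref = sine_cosine_template(freq=10.0, n_harmonics=5, fs=fs, n_samples=n_samples)
projected = least_squares_projection(trial, ref)
print(projected.shape)                           # (10, 250): 2*5 harmonic components over time
```

Per the abstract, OS-SSVEP then feeds both the transformed and the original data into CSDuDoFN and fuses their features at different network layers; the sketch stops at the projection itself.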
Funding: National Key R&D Program of China [2022YFF1202303]; National Natural Science Foundation of China [62071447]; Project of Jiangsu Province Science and Technology Plan Special Fund [BE2022064-1]; National Natural Science Foundation of China [62271465]; Suzhou Basic Research Program [SYG202338]; Open Fund Project of Guangdong Academy of Medical Sciences, China [YKY-KF202206]
WoS research areas: Computer Science; Neurosciences & Neurology
Language: English
WoS accession number: WOS:001327499800001
Publisher: PERGAMON-ELSEVIER SCIENCE LTD
Source URL: http://119.78.100.204/handle/2XEOYT63/39547
Collection: Institute of Computing Technology, Chinese Academy of Sciences - Journal papers (English)
Corresponding authors: Wang, Yijun; Zhou, S. Kevin
Author affiliations:
1.Univ Sci & Technol China, Sch Biomed Engn, Div Life Sci & Med, Hefei 230026, Anhui, Peoples R China
2.Univ Sci & Technol China, Suzhou Inst Adv Res, Ctr Med Imaging Robot Analyt Comp & Learning MIRAC, Suzhou 215123, Jiangsu, Peoples R China
3.Nanjing Agr Univ, Coll Artificial Intelligence, Nanjing 210095, Jiangsu, Peoples R China
4.Chinese Acad Sci, Inst Semicond, State Key Lab Integrated Optoelect, Beijing 100083, Peoples R China
5.Univ Sci & Technol China, Key Lab Precis & Intelligent Chem, Hefei 230026, Anhui, Peoples R China
6.Chinese Acad Sci, Inst Comp Technol, Key Lab Intelligent Informat Proc, Beijing 100190, Peoples R China
Recommended citation formats
GB/T 7714: Deng, Yang, Ji, Zhiwei, Wang, Yijun, et al. OS-SSVEP: One-shot SSVEP classification[J]. NEURAL NETWORKS, 2024, 180: 12.
APA: Deng, Yang, Ji, Zhiwei, Wang, Yijun, & Zhou, S. Kevin. (2024). OS-SSVEP: One-shot SSVEP classification. NEURAL NETWORKS, 180, 12.
MLA: Deng, Yang, et al. "OS-SSVEP: One-shot SSVEP classification". NEURAL NETWORKS 180 (2024): 12.

Ingestion method: OAI harvesting

Source: Institute of Computing Technology

