Decoding Electromyographic Signal With Multiple Labels for Hand Gesture Recognition
Document Type: Journal Article
Author(s) | Zou, Yongxiang (1,3,4); Cheng, Long; Han, Lijun; Li, Zhengwei; Song, Luping |
Journal | IEEE SIGNAL PROCESSING LETTERS |
Publication Date | 2023 |
Volume | 30 |
Pages | 483-487 |
Keywords | Feature extraction; Gesture recognition; Decoding; Aggregates; Muscles; Hospitals; Graph neural networks; Electromyogram decoding; graph neural network; hand gesture recognition; multiple labels |
ISSN | 1070-9908 |
DOI | 10.1109/LSP.2023.3264417 |
Corresponding Author | Song, Luping (songluping882002@aliyun.com) |
Abstract | Surface electromyography (sEMG) is a significant interaction signal in the fields of human-computer interaction and rehabilitation assessment, as it can be used for hand gesture recognition. This letter proposes a novel MLHG model to improve the robustness of sEMG-based hand gesture recognition. The model utilizes multiple labels to decode the sEMG signals from two different perspectives. In the first view, the sEMG signals are transformed into motion signals using the proposed FES-MSCNN (Feature Extraction of sEMG with Multiple Sub-CNN modules). A discriminator, FEM-SAGE (Feature Extraction of Motion with graph SAmple and aggreGatE model), is employed to judge the authenticity of the generated motion data, and the deep features of the motion signals are extracted using the FEM-SAGE model. In the second view, the deep features of the sEMG signals are extracted using the FES-MSCNN model. The extracted features of the sEMG signals and the generated motion signals are then fused for hand gesture recognition. To evaluate the performance of the proposed model, a dataset containing sEMG signals and multiple labels from 12 subjects was collected. The experimental results indicate that the MLHG model achieves an accuracy of 99.26% for within-session hand gesture recognition, 78.47% for cross-time, and 53.52% for cross-subject. These results represent a significant improvement over using only the gesture labels, with accuracy gains of 1.91%, 5.35%, and 5.25% in the within-session, cross-time, and cross-subject cases, respectively. |
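The abstract describes a dual-view pipeline: a multi-branch CNN (FES-MSCNN) that both extracts deep sEMG features and generates a motion signal, a GraphSAGE-style module (FEM-SAGE) that extracts motion features and scores the generated motion's authenticity, and a classifier over the fused features. Below is a minimal PyTorch sketch of that idea only, not the authors' implementation: the 8 sEMG channels, 200-sample windows, 21-dimensional motion vector, fully connected motion graph, layer sizes, and class count are illustrative assumptions rather than values taken from the letter.

```python
import torch
import torch.nn as nn


class SubCNN(nn.Module):
    """One 1-D sub-CNN branch over a raw sEMG window (channels x time)."""
    def __init__(self, in_ch, out_dim, kernel):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(in_ch, 32, kernel, padding=kernel // 2),
            nn.BatchNorm1d(32), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),   # -> (B, 32)
            nn.Linear(32, out_dim),
        )

    def forward(self, x):
        return self.net(x)


class FESMSCNN(nn.Module):
    """Sketch of FES-MSCNN: several sub-CNNs with different kernel sizes,
    producing (a) a deep sEMG feature and (b) a generated motion vector."""
    def __init__(self, in_ch=8, feat_dim=64, motion_dim=21):
        super().__init__()
        self.branches = nn.ModuleList(SubCNN(in_ch, feat_dim, k) for k in (3, 5, 7))
        self.to_motion = nn.Linear(3 * feat_dim, motion_dim)

    def forward(self, emg):                          # emg: (B, in_ch, T)
        feat = torch.cat([b(emg) for b in self.branches], dim=1)
        return feat, self.to_motion(feat)            # sEMG feature, generated motion


class SAGELayer(nn.Module):
    """Mean-aggregation GraphSAGE-style layer: h_v = W1 x_v + W2 mean(x_neighbors)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.self_lin = nn.Linear(in_dim, out_dim)
        self.neigh_lin = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):                       # x: (B, N, F), adj: (N, N)
        deg = adj.sum(-1, keepdim=True).clamp(min=1)
        neigh = adj @ x / deg                        # mean over neighbours
        return torch.relu(self.self_lin(x) + self.neigh_lin(neigh))


class FEMSAGE(nn.Module):
    """Sketch of FEM-SAGE: treats each motion dimension (e.g. a joint angle) as a
    graph node, extracting a motion feature and a real/fake (authenticity) score."""
    def __init__(self, n_nodes=21, feat_dim=64):
        super().__init__()
        self.sage = SAGELayer(1, 16)
        self.readout = nn.Linear(n_nodes * 16, feat_dim)
        self.disc = nn.Linear(feat_dim, 1)           # adversarial "authenticity" head

    def forward(self, motion, adj):                  # motion: (B, n_nodes)
        h = self.sage(motion.unsqueeze(-1), adj).flatten(1)
        feat = torch.relu(self.readout(h))
        return feat, torch.sigmoid(self.disc(feat))


class MLHGSketch(nn.Module):
    """Fuse the sEMG-view and motion-view features for gesture classification."""
    def __init__(self, n_classes=10, motion_dim=21):
        super().__init__()
        self.emg_view = FESMSCNN(motion_dim=motion_dim)
        self.motion_view = FEMSAGE(n_nodes=motion_dim)
        self.classifier = nn.Linear(3 * 64 + 64, n_classes)
        # placeholder adjacency: fully connected motion graph
        self.register_buffer("adj", torch.ones(motion_dim, motion_dim))

    def forward(self, emg):
        emg_feat, motion = self.emg_view(emg)
        motion_feat, realness = self.motion_view(motion, self.adj)
        return self.classifier(torch.cat([emg_feat, motion_feat], dim=1)), realness


if __name__ == "__main__":
    model = MLHGSketch()
    logits, realness = model(torch.randn(4, 8, 200))  # 4 windows, 8 channels, 200 samples
    print(logits.shape, realness.shape)               # (4, 10) and (4, 1)
```

In the full MLHG setup one would additionally train the motion generator against the FEM-SAGE discriminator using the recorded motion labels; the sketch above only shows the forward pass and the feature fusion step.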
Funding Project(s) | National Key Research and Development Program of China [2022YFB4703204]; CAS Project for Young Scientists in Basic Research [YSBR-034] |
WOS Research Area | Engineering |
Language | English |
WOS Accession Number | WOS:000982369900001 |
Publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC |
Funding Organization | National Key Research and Development Program of China; CAS Project for Young Scientists in Basic Research |
Source URL | http://ir.ia.ac.cn/handle/173211/53281 |
Collection | State Key Laboratory of Multimodal Artificial Intelligence Systems |
Author Affiliations | 1. Univ Chinese Acad Sci, Sch Artificial Intelligence, Beijing 100049, Peoples R China; 2. Huazhong Univ Sci & Technol Union Shenzhen, Shenzhen Peoples Hosp 6, Nanshan Hosp, Shenzhen 518172, Peoples R China; 3. Chinese Acad Sci, State Key Lab Management & Control Complex Syst, Beijing 100190, Peoples R China; 4. Chinese Acad Sci, Inst Automat, Beijing 100190, Peoples R China |
Recommended Citation (GB/T 7714) | Zou, Yongxiang, Cheng, Long, Han, Lijun, et al. Decoding Electromyographic Signal With Multiple Labels for Hand Gesture Recognition[J]. IEEE SIGNAL PROCESSING LETTERS, 2023, 30: 483-487. |
APA | Zou, Yongxiang, Cheng, Long, Han, Lijun, Li, Zhengwei, & Song, Luping. (2023). Decoding Electromyographic Signal With Multiple Labels for Hand Gesture Recognition. IEEE SIGNAL PROCESSING LETTERS, 30, 483-487. |
MLA | Zou, Yongxiang, et al. "Decoding Electromyographic Signal With Multiple Labels for Hand Gesture Recognition." IEEE SIGNAL PROCESSING LETTERS 30 (2023): 483-487. |
Ingest Method: OAI harvesting
Source: Institute of Automation