A wearable-HAR oriented sensory data generation method based on spatio-temporal reinforced conditional GANs
Document Type: Journal Article
Authors | Wang, Jiwei (1,2,3); Chen, Yiqiang (1,2); Gu, Yang (1,2) |
Journal | NEUROCOMPUTING |
Publication Date | 2022-07-07 |
Volume | 493, Pages 548-567 |
Keywords | Conditional SensoryGANs; Spatial-temporal features; Wearable-HAR; Log-cosh based adversarial loss; Cosine similarity; Qualitative visual evaluations; Quantitative evaluations |
ISSN | 0925-2312 |
DOI | 10.1016/j.neucom.2021.12.097 |
Abstract | Human activity recognition based on wearable sensors plays an essential role in many practical applications, such as healthcare, motion monitoring, medical examination, anomaly detection and human-computer interaction. Notably, longer temporal sensory sequences reflect the characteristics of different daily activities more accurately. However, existing GAN-based time series generation methods can only synthesize uniaxial, multivariate or multidimensional sensor data over a relatively short span of time, and these shorter synthetic time series cannot effectively represent even one complete daily activity cycle. To synthesize longer and more realistic multi-axial sensor data, this paper proposes a new customized GAN-based sensory data synthesizing method dedicated to wearable activity recognition tasks, named Conditional SensoryGANs. Firstly, the elaborately designed MultiScale MultiDimensional (MSMD) spatiotemporal module endows Conditional SensoryGANs with the capability of synthesizing longer sensory sequences, which better characterize different periodic behaviors. Secondly, benefiting from the well-designed Time-Frequency Enhancement (TFE) module, Conditional SensoryGANs can more accurately capture each axis's spatiotemporal properties and the spatial correlation between different axes, improving the fidelity of synthetic sensor data. Thirdly, Conditional SensoryGANs can synthesize verisimilar wearable sensor data of a specified quantity and category under a unified framework, via the refined control of the embedded condition. Qualitative visual evaluations demonstrate that the proposed method is better able to synthesize verisimilar wearable multi-axial sensor data than state-of-the-art GAN-based sensor data generation methods. Quantitative experiments also show that it achieves better results than off-the-shelf GAN-based time series methods for synthesizing wearable multi-axial sensor data. Meanwhile, empirical results demonstrate that synthetic sensor data from Conditional SensoryGANs achieve usability comparable to that of real sensor data in wearable human activity recognition. (c) 2021 Elsevier B.V. All rights reserved. |
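The record's keywords mention a log-cosh based adversarial loss and cosine similarity. The paper's actual implementation is not reproduced in this record; as a purely illustrative sketch, the two quantities named in the keywords can be computed on toy tri-axial sensor sequences as follows (all names here are hypothetical, not the authors' code):

```python
import numpy as np

def log_cosh_loss(pred, target):
    # log-cosh behaves like squared error for small residuals and like
    # absolute error for large ones, making it less outlier-sensitive
    # than plain MSE.
    diff = pred - target
    return float(np.mean(np.log(np.cosh(diff))))

def cosine_similarity(a, b):
    # Cosine similarity between two flattened sensor sequences:
    # 1.0 means identical direction, 0.0 means orthogonal.
    a, b = a.ravel(), b.ravel()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy example: a "real" tri-axial signal and a slightly perturbed "fake".
rng = np.random.default_rng(0)
real = rng.standard_normal((128, 3))   # 128 timesteps, 3 axes
fake = real + 0.1 * rng.standard_normal((128, 3))

print(log_cosh_loss(fake, real))
print(cosine_similarity(fake, real))
```

A high cosine similarity and a low log-cosh loss between synthetic and real sequences would indicate close agreement, which is the general intent behind such fidelity measures.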
Funding | Key-Area Research and Development Program of Guangdong Province [2019B010109001]; Natural Science Foundation of China [61902377, 61972383, 62101530]; Youth Innovation Promotion Association CAS |
WOS Research Area | Computer Science |
Language | English |
WOS Accession Number | WOS:000800351800008 |
Publisher | ELSEVIER |
Source URL | http://119.78.100.204/handle/2XEOYT63/19608 |
Collection | Institute of Computing Technology, Chinese Academy of Sciences, Journal Articles (English) |
Corresponding Author | Chen, Yiqiang |
Affiliations | 1. Beijing Key Lab of Mobile Computing & Pervasive Device, Beijing, People's Republic of China; 2. Institute of Computing Technology, Chinese Academy of Sciences, Beijing, People's Republic of China; 3. University of Chinese Academy of Sciences, Beijing, People's Republic of China |
Recommended Citation (GB/T 7714) | Wang, Jiwei, Chen, Yiqiang, Gu, Yang. A wearable-HAR oriented sensory data generation method based on spatio-temporal reinforced conditional GANs[J]. NEUROCOMPUTING, 2022, 493: 548-567. |
APA | Wang, Jiwei, Chen, Yiqiang, & Gu, Yang. (2022). A wearable-HAR oriented sensory data generation method based on spatio-temporal reinforced conditional GANs. NEUROCOMPUTING, 493, 548-567. |
MLA | Wang, Jiwei, et al. "A wearable-HAR oriented sensory data generation method based on spatio-temporal reinforced conditional GANs." NEUROCOMPUTING 493 (2022): 548-567. |
Ingestion Method: OAI Harvesting
Source: Institute of Computing Technology
Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.