Chinese Academy of Sciences Institutional Repositories Grid
Joint spatial temporal attention for action recognition

Document Type: Journal Article

Authors: Tingzhao Yu 1,2; Chaoxu Guo 1,2; Lingfeng Wang 1; Huxiang Gu 1; Shiming Xiang 1; Chunhong Pan 1
Journal: Pattern Recognition Letters
Publication Date: 2018
Issue: 112; Pages: 226-233
Keywords: Action Recognition; Spatial-temporal Attention; Two-stage
Abstract

In this paper, we propose a novel high-level action representation using a joint spatial-temporal attention model, with application to video-based human action recognition. Specifically, to extract robust motion representations of videos, a new spatial attention module based on 3D convolution is proposed, which attends to the salient parts of the spatial areas. To better handle long-duration videos, a new bidirectional-LSTM-based temporal attention module is introduced, which focuses on the key video cubes of a given video rather than the key video frames. The spatial-temporal attention network can be jointly trained via a two-stage strategy, which enables us to simultaneously explore the correlation in both the spatial and temporal domains. Experimental results on benchmark action recognition datasets demonstrate the effectiveness of our network.
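The abstract describes the architecture only at a high level. Below is a minimal PyTorch sketch of the two attention modules under conventional assumptions: a 3D convolution producing a sigmoid spatial attention map, and a bidirectional LSTM scoring cube-level features with a softmax. The module names (SpatialAttention3D, TemporalAttentionBiLSTM), layer sizes, and wiring are illustrative assumptions, not the authors' implementation.

```python
# A minimal sketch of the two attention modules described in the abstract.
# Layer sizes, module names, and wiring are illustrative assumptions.
import torch
import torch.nn as nn


class SpatialAttention3D(nn.Module):
    """Re-weights salient spatial regions of a video clip via 3D convolution."""

    def __init__(self, channels: int):
        super().__init__()
        # One attention map per frame, computed with a 3D convolution.
        self.conv = nn.Conv3d(channels, 1, kernel_size=3, padding=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time, height, width)
        attn = torch.sigmoid(self.conv(x))   # (batch, 1, T, H, W)
        return x * attn                      # emphasize salient spatial areas


class TemporalAttentionBiLSTM(nn.Module):
    """Scores cube-level features with a bidirectional LSTM."""

    def __init__(self, feat_dim: int, hidden: int = 256):
        super().__init__()
        self.lstm = nn.LSTM(feat_dim, hidden, batch_first=True,
                            bidirectional=True)
        self.score = nn.Linear(2 * hidden, 1)

    def forward(self, cubes: torch.Tensor) -> torch.Tensor:
        # cubes: (batch, num_cubes, feat_dim), one feature vector per video cube
        h, _ = self.lstm(cubes)                  # (batch, num_cubes, 2*hidden)
        w = torch.softmax(self.score(h), dim=1)  # attention over cubes
        return (w * cubes).sum(dim=1)            # attended video descriptor


# Example: weight an RGB clip spatially, then pool 8 cube features temporally.
clip = torch.randn(2, 3, 16, 112, 112)            # (batch, C, T, H, W)
weighted_clip = SpatialAttention3D(channels=3)(clip)
cube_feats = torch.randn(2, 8, 512)               # e.g. from a 3D-CNN backbone
video_desc = TemporalAttentionBiLSTM(feat_dim=512)(cube_feats)
```

A plausible reading of the two-stage strategy is that each attention module is first trained separately before the whole network is fine-tuned jointly; consult the paper for the actual training schedule.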

Language: English
Source URL: http://ir.ia.ac.cn/handle/173211/23618
Collection: Institute of Automation, National Laboratory of Pattern Recognition, Remote Sensing Image Processing Team
Corresponding Author: Tingzhao Yu
Affiliations:
1. National Laboratory of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences
2. School of Computer and Control Engineering, University of Chinese Academy of Sciences
Recommended Citation:
GB/T 7714: Tingzhao Yu, Chaoxu Guo, Lingfeng Wang, et al. Joint spatial temporal attention for action recognition[J]. Pattern Recognition Letters, 2018(112): 226-233.
APA: Tingzhao Yu, Chaoxu Guo, Lingfeng Wang, Huxiang Gu, Shiming Xiang, & Chunhong Pan. (2018). Joint spatial temporal attention for action recognition. Pattern Recognition Letters(112), 226-233.
MLA: Tingzhao Yu, et al. "Joint spatial temporal attention for action recognition." Pattern Recognition Letters 112 (2018): 226-233.

Deposit Method: OAI Harvesting

Source: Institute of Automation

Unless otherwise stated, all content in this system is protected by copyright, and all rights are reserved.