Chinese Academy of Sciences Institutional Repositories Grid
Part-aligned pose-guided recurrent network for action recognition

Document Type: Journal Article

Authors: Huang, Linjiang1,2; Huang, Yan1,2; Ouyang, Wanli4; Wang, Liang1,2,3
Journal: PATTERN RECOGNITION
Publication Date: 2019-08-01
Volume: 92, Pages: 165-176
Keywords: Action recognition; Part alignment; Auto-transformer attention
ISSN: 0031-3203
DOI: 10.1016/j.patcog.2019.03.010
Corresponding Author: Wang, Liang (wangliang@nlpr.ia.ac.cn)
Abstract: Action recognition using pose information has drawn much attention recently. However, most previous approaches treat the human pose as a whole or use pose only to extract robust features. In fact, human body parts play an important role in actions, so modeling the spatio-temporal information of body parts can effectively assist in classifying actions. In this paper, we propose a Part-aligned Pose-guided Recurrent Network (P²RN) for action recognition. The model mainly consists of two modules, a part alignment module and a part pooling module, which are used for part representation learning and part-related feature fusion, respectively. The part alignment module incorporates an auto-transformer attention, aiming to capture the spatial configuration of body parts and predict pose attention maps, while the part pooling module exploits both the symmetry and the complementarity of body parts to produce a fused body representation. The whole network is recurrent, so it can exploit the body representation while simultaneously modeling the spatio-temporal evolution of human body parts. Experiments on two publicly available benchmark datasets show state-of-the-art performance and demonstrate the power of the two proposed modules. (C) 2019 Elsevier Ltd. All rights reserved.
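The abstract's pipeline (per-part pose attention maps → attention-weighted part pooling → fused body representation fed to a recurrent unit) can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: all parameter names, tensor sizes, the mean-fusion step, and the GRU-style update are hypothetical stand-ins for the modules the abstract describes.

```python
import numpy as np

rng = np.random.default_rng(0)
T, C, H, W, P = 4, 8, 7, 7, 5   # frames, channels, feature-map size, body parts (illustrative)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Hypothetical parameters: a projection predicting one attention map per body part,
# and GRU-style recurrent weights over the fused body representation.
W_att = 0.1 * rng.standard_normal((P, C))
W_xz, W_hz = 0.1 * rng.standard_normal((C, C)), 0.1 * rng.standard_normal((C, C))
W_xh, W_hh = 0.1 * rng.standard_normal((C, C)), 0.1 * rng.standard_normal((C, C))

h = np.zeros(C)                                        # recurrent state
for t in range(T):
    feat = rng.standard_normal((C, H, W))              # stand-in for CNN features of frame t
    logits = np.einsum('pc,chw->phw', W_att, feat)     # per-part attention logits
    att = np.stack([softmax(l.ravel()).reshape(H, W)   # normalized pose attention maps
                    for l in logits])
    parts = np.einsum('phw,chw->pc', att, feat)        # attention-weighted part pooling
    body = parts.mean(axis=0)                          # fuse part features into one body vector
    z = sigmoid(body @ W_xz + h @ W_hz)                # GRU-style update gate
    h = (1 - z) * h + z * np.tanh(body @ W_xh + h @ W_hh)

print(h.shape)  # → (8,)
```

The recurrent state `h` accumulates the per-frame body representations, matching the abstract's claim that the network models spatio-temporal evolution of body parts across frames; the actual model's attention and fusion are richer (symmetry- and complementarity-aware) than this mean-pooling placeholder.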
WOS Keywords: REPRESENTATION; HISTOGRAMS
Funding Projects: National Key Research and Development Program of China [2016YFB1001000]; National Natural Science Foundation of China [61525306, 61633021, 61721004, 61420106015, 61806194]; Capital Science and Technology Leading Talent Training Project [Z181100006318030]; Beijing Science and Technology Project [Z181100008918010]
WOS Research Areas: Computer Science; Engineering
Language: English
WOS Record: WOS:000468013000014
Publisher: ELSEVIER SCI LTD
Funding Organizations: National Key Research and Development Program of China; National Natural Science Foundation of China; Capital Science and Technology Leading Talent Training Project; Beijing Science and Technology Project
Source URL: [http://ir.ia.ac.cn/handle/173211/24248]
Collection: Institute of Automation, Center for Research on Intelligent Perception and Computing
Corresponding Author: Wang, Liang
Author Affiliations:
1.Univ Chinese Acad Sci, Beijing, Peoples R China
2.NLPR, CRIPAC, Beijing, Peoples R China
3.Chinese Acad Sci CASIA, Inst Automat, Ctr Excellence Brain Sci & Intelligence Technol C, Beijing, Peoples R China
4.Univ Sydney, Sch Elect & Informat Engn, Sydney, NSW, Australia
Recommended Citation
GB/T 7714: Huang, Linjiang, Huang, Yan, Ouyang, Wanli, et al. Part-aligned pose-guided recurrent network for action recognition[J]. PATTERN RECOGNITION, 2019, 92: 165-176.
APA: Huang, Linjiang, Huang, Yan, Ouyang, Wanli, & Wang, Liang. (2019). Part-aligned pose-guided recurrent network for action recognition. PATTERN RECOGNITION, 92, 165-176.
MLA: Huang, Linjiang, et al. "Part-aligned pose-guided recurrent network for action recognition". PATTERN RECOGNITION 92 (2019): 165-176.

Ingest Method: OAI Harvesting

Source: Institute of Automation


Unless otherwise stated, all content in this system is protected by copyright, and all rights are reserved.