Chinese Academy of Sciences Institutional Repositories Grid
Skeleton-Based Action Recognition with Shift Graph Convolutional Network

Document Type: Conference Paper

Authors: Ke Cheng2,3; Yifan Zhang2,3; Xiangyu He2,3; Weihan Chen2,3; Jian Cheng1,2,3; Hanqing Lu2,3
Publication Date: 2020-06
Conference Date: June 2020
Conference Venue: Online
Abstract

Action recognition with skeleton data is attracting more attention in computer vision. Recently, graph convolutional networks (GCNs), which model the human body skeletons as spatiotemporal graphs, have obtained remarkable performance. However, the computational complexity of GCN-based methods is quite heavy, typically over 15 GFLOPs for one action sample. Recent works even reach ∼100 GFLOPs. Another shortcoming is that the receptive fields of both the spatial graph and the temporal graph are inflexible. Although some works enhance the expressiveness of the spatial graph by introducing incremental adaptive modules, their performance is still limited by regular GCN structures. In this paper, we propose a novel shift graph convolutional network (Shift-GCN) to overcome both shortcomings. Instead of using heavy regular graph convolutions, our Shift-GCN is composed of novel shift graph operations and lightweight point-wise convolutions, where the shift graph operations provide flexible receptive fields for both the spatial graph and the temporal graph. On three datasets for skeleton-based action recognition, the proposed Shift-GCN notably exceeds the state-of-the-art methods with more than 10× less computational complexity.
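The abstract's central idea, replacing a heavy graph convolution with a channel-wise shift across graph nodes followed by a lightweight point-wise (1×1) convolution, can be sketched as follows. This is a simplified illustration, not the authors' released implementation: it assumes a non-local spatial shift where channel c of node v takes its value from node (v + c) mod V, and represents the point-wise convolution as a per-node linear map.

```python
import numpy as np

def spatial_shift(features):
    """Non-local spatial shift (sketch).

    features: (V, C) array of node features.
    Channel c of node v takes its value from node (v + c) % V,
    so every channel partition sees a different neighbor offset.
    """
    V, C = features.shape
    shifted = np.empty_like(features)
    for c in range(C):
        # np.roll with a negative shift moves element (v + k) to position v.
        shifted[:, c] = np.roll(features[:, c], -(c % V))
    return shifted

def pointwise_conv(features, weight):
    """1x1 convolution = the same linear map applied at every node."""
    return features @ weight

# Toy example: 5 skeleton joints, 8 input channels, 4 output channels.
V, C_in, C_out = 5, 8, 4
rng = np.random.default_rng(0)
x = rng.standard_normal((V, C_in))
w = rng.standard_normal((C_in, C_out))

y = pointwise_conv(spatial_shift(x), w)
print(y.shape)  # (5, 4)
```

The shift itself costs no multiplications; all learnable computation sits in the point-wise convolution, which is how the FLOPs drop by an order of magnitude relative to regular graph convolutions.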

Proceedings Publisher: IEEE
Source URL: [http://ir.ia.ac.cn/handle/173211/44320]
Collection: Institute of Automation, National Laboratory of Pattern Recognition, Image and Video Analysis Group
Brain-inspired Chips and Systems Research
Corresponding Author: Yifan Zhang
Author Affiliations: 1. CAS Center for Excellence in Brain Science and Intelligence Technology
2. School of Artificial Intelligence, University of Chinese Academy of Sciences
3. NLPR & AIRIA, Institute of Automation, Chinese Academy of Sciences
Recommended Citation
GB/T 7714
Ke Cheng, Yifan Zhang, Xiangyu He, et al. Skeleton-Based Action Recognition with Shift Graph Convolutional Network[C]. Online, June 2020.

Ingest Method: OAI harvesting

Source: Institute of Automation


Unless otherwise specified, all content in this system is protected by copyright, with all rights reserved.