Human Action Recognition Using Local Spatio-Temporal Discriminant Embedding
Document Type: Conference Paper
Authors | Kui Jia; Dit-Yan Yeung |
Publication Date | 2008 |
Conference | 26th IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2008 |
Abstract (English) | Human action video sequences can be considered as nonlinear dynamic shape manifolds in the space of image frames. In this paper, we address learning and classifying human actions on embedded low-dimensional manifolds. We propose a novel manifold embedding method, called Local Spatio-Temporal Discriminant Embedding (LSTDE). The discriminating capabilities of the proposed method are two-fold: (1) for local spatial discrimination, LSTDE projects data points (silhouette-based image frames of human action sequences) in a local neighborhood into the embedding space where data points of the same action class are close while those of different classes are far apart; (2) in such a local neighborhood, each data point has an associated short video segment, which forms a local temporal subspace on the embedded manifold. LSTDE finds an optimal embedding which maximizes the principal angles between those temporal subspaces associated with data points of different classes. Benefiting from the joint spatio-temporal discriminant embedding, our method is potentially more powerful for classifying human actions with similar space-time shapes, and is able to perform recognition on a frame-by-frame or short video segment basis. Experimental results demonstrate that our method can accurately recognize human actions, and can improve the recognition performance over some representative manifold embedding methods, especially on highly confusing human action types. |
Indexed By | EI |
Language | English |
Source URL | [http://ir.siat.ac.cn:8080/handle/172644/2214] |
Collection | Shenzhen Institute of Advanced Technology, Integration Institute |
Recommended Citation (GB/T 7714) | Kui Jia, Dit-Yan Yeung. Human Action Recognition Using Local Spatio-Temporal Discriminant Embedding[C]. In: 26th IEEE Conference on Computer Vision and Pattern Recognition, CVPR, 2008. |
Ingestion Method: OAI harvesting
Source: Shenzhen Institute of Advanced Technology (SIAT)
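
The key quantity in the abstract above is the set of principal angles between the temporal subspaces spanned by short video segments of different action classes. As a reading aid, below is a minimal NumPy sketch of that standard subspace-angle computation; the dimensions, random data, and the `principal_angles` helper are illustrative assumptions and not the authors' LSTDE implementation.

```python
import numpy as np

def principal_angles(segment_a, segment_b):
    """Principal angles between the subspaces spanned by the columns of
    two short video segments (columns = embedded frame vectors).

    Illustrative helper only, not the authors' LSTDE code. Standard recipe:
    orthonormalize each segment, then the singular values of Qa^T Qb are
    the cosines of the principal angles.
    """
    qa, _ = np.linalg.qr(segment_a)   # orthonormal basis of segment A's subspace
    qb, _ = np.linalg.qr(segment_b)   # orthonormal basis of segment B's subspace
    cosines = np.linalg.svd(qa.T @ qb, compute_uv=False)
    return np.arccos(np.clip(cosines, -1.0, 1.0))

# Toy example: two segments of 5 embedded frames in an assumed 10-D embedding space.
rng = np.random.default_rng(0)
angles = principal_angles(rng.standard_normal((10, 5)),
                          rng.standard_normal((10, 5)))
print(np.degrees(angles))  # LSTDE seeks an embedding that enlarges such angles across classes
```

Per the abstract, this subspace-angle term drives the temporal part of the discriminant objective: after embedding, segments from different classes should exhibit large principal angles, while the local spatial term keeps same-class frames close.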