Chinese Academy of Sciences Institutional Repositories Grid
Moving target detection approach based on spatio-temporal salient perception

Document Type: Journal Article

Authors: Jin, Gang; Li, Zhengzhou; Gu, Yuanshan; Li, Jialing; Cao, Dong; Liu, Linyan
Journal: OPTIK
Publication Date: 2014
Volume: 125; Issue: 22; Pages: 6681-6686
ISSN: 0030-4026
Corresponding Author: Li, ZZ (reprint author), Chongqing Univ, Coll Commun Engn, Chongqing 400030, Peoples R China.
Abstract: The differences in texture and motion between a man-made object and the natural scene are key features that the human visual system uses to detect a moving object in a scene. This paper proposes a moving target detection approach based on spatio-temporal salient perception, a crucial function of the visual attention mechanism. Spatial features of the image, including edge, orientation, texture, and contrast, are extracted, and the corresponding spatial salient map is constructed by fusing these features through a difference-of-Gaussians (DoG) function, which suppresses what is common within a local region and enhances what is distinctive. Then, the global motion, local motion, and relative motion between consecutive frames are extracted by means of a multi-resolution pyramid, and the motion salient map is constructed once the motion difference between the moving target and the background is confirmed. Finally, the spatio-temporal salient map is constructed by fusing the spatial salient map and the motion salient map through a competition strategy, and the moving target is detected by searching for the maximum in the spatio-temporal salient map. Experimental results show that the method can accurately detect a moving target against a complex background. (C) 2014 Elsevier GmbH. All rights reserved.
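The spatial-saliency step described in the abstract (center-surround DoG filtering of feature maps, followed by a competition-style fusion and a maximum search) can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the sigma values, the peakedness-based fusion weights, and the toy feature maps are all assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def dog_saliency(feature_map, sigma_center=1.0, sigma_surround=4.0):
    """Difference of Gaussians: a fine (center) blur minus a coarse
    (surround) blur suppresses what is common to a neighborhood and
    enhances locally distinctive structure."""
    center = gaussian_filter(feature_map, sigma_center)
    surround = gaussian_filter(feature_map, sigma_surround)
    dog = np.clip(center - surround, 0, None)  # keep excitatory responses
    return dog / dog.max() if dog.max() > 0 else dog

def fuse_maps(maps):
    """A simple competition strategy (assumed here): weight each map by
    how 'peaked' it is (max minus mean), so a map with one strong
    response dominates flatter, less informative maps."""
    weights = [m.max() - m.mean() for m in maps]
    total = sum(weights) or 1.0
    return sum(w * m for w, m in zip(weights, maps)) / total

# Toy example: a bright patch (the "target") on a noisy background.
rng = np.random.default_rng(0)
img = rng.normal(0.0, 0.05, (64, 64))
img[30:36, 40:46] += 1.0

intensity_sal = dog_saliency(img)            # contrast feature channel
gy, gx = np.gradient(img)
edge_sal = dog_saliency(np.hypot(gy, gx))    # edge feature channel

sal = fuse_maps([intensity_sal, edge_sal])   # fused salient map
y, x = np.unravel_index(np.argmax(sal), sal.shape)
print(y, x)  # the maximum falls on or near the bright patch
```

The same maximum-search step would then be applied to the spatio-temporal map after fusing in the motion channel, which this sketch omits.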
Keywords: Moving target detection; Spatial salient map; Motion salient map; Spatio-temporal salient map
Indexed In: SCI
Language: English
WOS ID: WOS:000344976300015
Source URL: [http://ir.ioe.ac.cn/handle/181551/4143]
Collection: Institute of Optics and Electronics, Laboratory of Optoelectronic Engineering (Lab 1)
Author Affiliations:
1. [Jin, Gang; Cao, Dong; Liu, Linyan] China Aerodynam Res & Dev Ctr, Mianyang 621000, Peoples R China
2. [Li, Zhengzhou; Gu, Yuanshan; Li, Jialing] Chongqing Univ, Coll Commun Engn, Chongqing 400030, Peoples R China
3. [Li, Zhengzhou] Chinese Acad Sci, Key Lab Beam Control, Chengdu 610209, Peoples R China
Recommended Citation:
GB/T 7714: Jin, Gang, Li, Zhengzhou, Gu, Yuanshan, et al. Moving target detection approach based on spatio-temporal salient perception[J]. OPTIK, 2014, 125(22): 6681-6686.
APA: Jin, Gang, Li, Zhengzhou, Gu, Yuanshan, Li, Jialing, Cao, Dong, & Liu, Linyan. (2014). Moving target detection approach based on spatio-temporal salient perception. OPTIK, 125(22), 6681-6686.
MLA: Jin, Gang, et al. "Moving target detection approach based on spatio-temporal salient perception". OPTIK 125.22 (2014): 6681-6686.

Deposit Method: OAI harvesting

Source: Institute of Optics and Electronics


Unless otherwise stated, all content in this system is protected by copyright, and all rights are reserved.