Chinese Academy of Sciences Institutional Repositories Grid
Deep Spatial and Temporal Network for Robust Visual Object Tracking

Document Type: Journal Article

Authors: Teng, Zhu1; Xing, Junliang2; Wang, Qiang2; Zhang, Baopeng1; Fan, Jianping3
Journal: IEEE TRANSACTIONS ON IMAGE PROCESSING
Publication Date: 2020
Volume: 29, Pages: 1762-1775
Keywords: Target tracking; Visualization; Biological system modeling; Correlation; Training; Benchmark testing; Visual tracking; deep network; spatial-temporal LSTM
ISSN: 1057-7149
DOI: 10.1109/TIP.2019.2942502
Corresponding Author: Xing, Junliang (jlxing@nlpr.ia.ac.cn)
Abstract: There are two key components that can be leveraged for visual tracking: (a) object appearances and (b) object motions. Many existing techniques have recently employed deep learning to enhance visual tracking due to its superior representation power and strong learning ability; most of them exploit object appearances, but few exploit object motions. In this work, a deep spatial and temporal network (DSTN) is developed for visual tracking by explicitly exploiting both the object representations from each frame and their dynamics along multiple frames in a video, so that it can seamlessly integrate the object appearances with their motions to produce compact object appearances and capture their temporal variations effectively. Our DSTN method, which is deployed in a tracking pipeline in a coarse-to-fine form, can perceive the subtle differences in spatial and temporal variations of the target (the object being tracked), and thus it benefits from both off-line training and online fine-tuning. We have also conducted experiments on four large tracking benchmarks, including OTB-2013, OTB-2015, VOT2015, and VOT2017, and the results demonstrate that our DSTN method achieves competitive performance compared with state-of-the-art techniques. The source code, trained models, and all experimental results of this work will be made publicly available to facilitate further studies on this problem.
Funding Projects: Natural Science Foundation of China [61972027]; Natural Science Foundation of China [61672519]; Natural Science Foundation of China [61872035]; Fundamental Research Funds for the Central Universities of China [2019JBM022]
WOS Research Areas: Computer Science; Engineering
Language: English
WOS Accession Number: WOS:000501324900008
Publisher: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
Funding Organizations: Natural Science Foundation of China; Fundamental Research Funds for the Central Universities of China
Source URL: [http://ir.ia.ac.cn/handle/173211/29339]
Collection: Intelligent Systems and Engineering
Author Affiliations:
1. Beijing Jiaotong Univ, Sch Comp & Informat Technol, Beijing 100044, Peoples R China
2. Chinese Acad Sci, Inst Automat, Natl Lab Pattern Recognit, Beijing 100190, Peoples R China
3. Univ North Carolina Charlotte, Dept Comp Sci, Charlotte, NC 28223 USA
Recommended Citation
GB/T 7714
Teng, Zhu, Xing, Junliang, Wang, Qiang, et al. Deep Spatial and Temporal Network for Robust Visual Object Tracking[J]. IEEE TRANSACTIONS ON IMAGE PROCESSING, 2020, 29: 1762-1775.
APA: Teng, Zhu, Xing, Junliang, Wang, Qiang, Zhang, Baopeng, & Fan, Jianping. (2020). Deep Spatial and Temporal Network for Robust Visual Object Tracking. IEEE TRANSACTIONS ON IMAGE PROCESSING, 29, 1762-1775.
MLA: Teng, Zhu, et al. "Deep Spatial and Temporal Network for Robust Visual Object Tracking". IEEE TRANSACTIONS ON IMAGE PROCESSING 29 (2020): 1762-1775.

Ingestion Method: OAI Harvesting

Source: Institute of Automation


Unless otherwise specified, all content in this system is protected by copyright, with all rights reserved.