Chinese Academy of Sciences Institutional Repositories Grid
DCFNet: Discriminant Correlation Filters Network for Visual Tracking

Document Type: Journal Article

Authors: Hu, Wei-Ming1; Wang, Qiang1; Gao, Jin1; Li, Bing1; Maybank, Stephen2
Journal: JOURNAL OF COMPUTER SCIENCE AND TECHNOLOGY
Publication Date: 2024-05-01
Volume: 39  Issue: 3  Pages: 691-714
Keywords: correlation filter; convolutional neural network (CNN); visual tracking
ISSN: 1000-9000
DOI: 10.1007/s11390-023-3788-3
Corresponding Authors: Hu, Wei-Ming (wmhu@nlpr.ia.ac.cn); Wang, Qiang (qiang.wang@nlpr.ia.ac.cn); Gao, Jin (jin.gao@nlpr.ia.ac.cn); Li, Bing (bli@nlpr.ia.ac.cn); Maybank, Stephen (sjmaybank@dcs.bbk.ac.uk)
Abstract: CNN (convolutional neural network) based real-time trackers usually do not carry out online network updates, in order to maintain a rapid tracking speed. This inevitably limits their adaptability to changes in object appearance. Correlation filter based trackers can update the model parameters online in real time. In this paper, we present an end-to-end lightweight network architecture, namely the Discriminant Correlation Filter Network (DCFNet). A differentiable DCF (discriminant correlation filter) layer is incorporated into a Siamese network architecture in order to learn the convolutional features and the correlation filter simultaneously. The correlation filter can be efficiently updated online. In previous work, we introduced a joint scale-position space into the DCFNet, forming a scale DCFNet which predicts object scale and position simultaneously. We combine the scale DCFNet with a convolutional-deconvolutional network, learning both the high-level embedding space representations and the low-level fine-grained representations of images. The adaptability of the fine-grained correlation analysis and the generalization capability of the semantic embedding are complementary for visual tracking. The back-propagation is derived in the Fourier frequency domain throughout the entire work, preserving the efficiency of the DCF. Extensive evaluations on the OTB (Object Tracking Benchmark) and VOT (Visual Object Tracking Challenge) datasets demonstrate that the proposed trackers run at high speeds while maintaining tracking accuracy.
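A minimal sketch of the closed-form DCF computation described in the abstract: the correlation filter is solved by ridge regression in the Fourier domain and then correlated with the search-patch features. The synthetic feature maps, the Gaussian label, and the regularization weight lam below are illustrative assumptions, and this is not the authors' DCFNet code; the learned CNN features, the differentiable layer, and the online update are omitted.

```python
# Sketch of a multi-channel discriminant correlation filter (DCF) evaluated in the FFT domain.
# NOT the authors' implementation; inputs and hyperparameters are illustrative assumptions.
import numpy as np

def gaussian_label(h, w, sigma=2.0):
    """Centered 2-D Gaussian regression target commonly used to train a DCF."""
    yy, xx = np.meshgrid(np.arange(h) - h // 2, np.arange(w) - w // 2, indexing="ij")
    return np.exp(-(yy ** 2 + xx ** 2) / (2.0 * sigma ** 2))

def dcf_response(x, z, y, lam=1e-4):
    """Fit a correlation filter to template features x (C, H, W) by per-frequency ridge
    regression, then correlate it with search features z (C, H, W); all in the FFT domain."""
    x_hat = np.fft.fft2(x, axes=(-2, -1))
    z_hat = np.fft.fft2(z, axes=(-2, -1))
    y_hat = np.fft.fft2(np.fft.ifftshift(y))          # move the label peak to index (0, 0)
    denom = np.sum(np.abs(x_hat) ** 2, axis=0) + lam  # per-frequency energy over channels
    w_hat_conj = y_hat * np.conj(x_hat) / denom       # conjugated filter spectrum (closed form)
    r_hat = np.sum(w_hat_conj * z_hat, axis=0)        # correlation theorem: conj(w_hat) * z_hat
    return np.real(np.fft.ifft2(r_hat))

# Toy usage: the search features are a circular shift of the template features, so the
# response peak should encode that shift (modulo the patch size).
C, H, W = 32, 64, 64
x = np.random.randn(C, H, W)
z = np.roll(x, shift=(3, -5), axis=(-2, -1))
resp = dcf_response(x, z, gaussian_label(H, W))
print(np.unravel_index(resp.argmax(), resp.shape))    # expected near (3, 59) for a 64x64 patch
```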
WOS Keywords: OBJECT TRACKING
Funding Projects: National Key Research and Development Program of China [2020AAA0105802]; National Key Research and Development Program of China [2020AAA0105800]; National Natural Science Foundation of China [62036011]; National Natural Science Foundation of China [62192782]; National Natural Science Foundation of China [61721004]; National Natural Science Foundation of China [U2033210]; Beijing Natural Science Foundation [L223003]
WOS Research Area: Computer Science
Language: English
WOS Record Number: WOS:001274075100002
Publisher: SPRINGER SINGAPORE PTE LTD
Funding Agencies: National Key Research and Development Program of China; National Natural Science Foundation of China; Beijing Natural Science Foundation
Source URL: http://ir.ia.ac.cn/handle/173211/59339
Collection: Institute of Automation_National Laboratory of Pattern Recognition_Video Content Security Team
Author Affiliations:
1. Chinese Acad Sci, Inst Automat, State Key Lab Multimodal Artificial Intelligence S, Beijing 100190, Peoples R China
2. Birkbeck Coll, Dept Comp Sci & Informat Syst, London WC1E 7HX, England
Recommended Citation
GB/T 7714
Hu, Wei-Ming, Wang, Qiang, Gao, Jin, et al. DCFNet: Discriminant Correlation Filters Network for Visual Tracking[J]. JOURNAL OF COMPUTER SCIENCE AND TECHNOLOGY, 2024, 39(3): 691-714.
APA: Hu, Wei-Ming, Wang, Qiang, Gao, Jin, Li, Bing, & Maybank, Stephen. (2024). DCFNet: Discriminant Correlation Filters Network for Visual Tracking. JOURNAL OF COMPUTER SCIENCE AND TECHNOLOGY, 39(3), 691-714.
MLA: Hu, Wei-Ming, et al. "DCFNet: Discriminant Correlation Filters Network for Visual Tracking". JOURNAL OF COMPUTER SCIENCE AND TECHNOLOGY 39.3 (2024): 691-714.

Deposit Method: OAI Harvesting

Source: Institute of Automation


Unless otherwise stated, all content in this system is protected by copyright, and all rights are reserved.