Recurrent Metric Networks and Batch Multiple Hypothesis for Multi-Object Tracking
Document Type | Journal Article
Author | LONGTAO CHEN; XIAOJIANG PENG; MINGWU REN
Journal | IEEE ACCESS
Publication Date | 2018
Document Subtype | Journal Article
Abstract | Multi-object tracking aims to recover object trajectories given multiple detections in video frames. Object feature extraction and similarity metric are two keys to reliably associate trajectories. In this paper, we propose the recurrent metric network (RMNet), a CNN-RNN based similarity metric framework for multi-object tracking. Given a reference object, the RMNet takes as input random positive and negative detections and outputs similarity scores over time. The RMNet handles long-term temporal object variations and false object detections by its long short-term memory units. With the scores from RMNet, we introduce a batch multiple hypothesis (BMH) strategy, a simple yet efficient data association method for batch multi-object tracking. BMH generates a hypothesis tree for each object with a dual-threshold hypothesis generation approach, and then selects the best branch (or hypothesis) for each object as the batch tracking result. Specifically, we model hypothesis selection as a 0-1 programming problem and introduce a reward function to re-find objects in case of missing detection. We evaluate our RMNet and BMH strategy on several popular datasets: 2DMOT2015, PETS2009, TUD, and KITTI. We achieve performance comparable or superior to that of the state-of-the-art methods.
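The abstract states that BMH casts hypothesis selection as a 0-1 programming problem: pick at most one branch of each object's hypothesis tree so that selected branches do not conflict and the total score is maximized. The sketch below is only an illustration of that general formulation, not the authors' exact model; the branch data, the detection-disjointness constraint, and the scores are hypothetical, and it enumerates binary assignments by brute force instead of calling an ILP solver.

```python
# Hypothetical 0-1 selection sketch inspired by the abstract's description of BMH
# (toy data and constraints assumed for illustration; not the paper's formulation).
from itertools import product

# (object_id, branch_score, detections used by the branch) -- toy hypothesis branches
branches = [
    ("obj1", 2.5, frozenset({"d1", "d4", "d7"})),
    ("obj1", 1.8, frozenset({"d1", "d5"})),
    ("obj2", 3.1, frozenset({"d2", "d5", "d8"})),
    ("obj2", 2.0, frozenset({"d2", "d6"})),
]

def solve_01(branches):
    """Brute-force the binary selection: maximize total branch score subject to
    (a) at most one branch per object and (b) no detection shared by two branches."""
    best_score, best_choice = float("-inf"), None
    for x in product((0, 1), repeat=len(branches)):  # every 0-1 assignment
        chosen = [b for xi, b in zip(x, branches) if xi]
        objs = [b[0] for b in chosen]
        if len(objs) != len(set(objs)):          # at most one branch per object
            continue
        dets = [d for b in chosen for d in b[2]]
        if len(dets) != len(set(dets)):          # branches must not reuse detections
            continue
        score = sum(b[1] for b in chosen)        # a reward term could be folded in here
        if score > best_score:
            best_score, best_choice = score, chosen
    return best_score, best_choice

if __name__ == "__main__":
    score, choice = solve_01(branches)
    print(score, [(b[0], sorted(b[2])) for b in choice])
```

On this toy input the best selection keeps the first branch of each object (total score 5.6); a real tracker would pass the same objective and constraints to an integer programming solver rather than enumerating assignments.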
Language | English
Source URL | http://ir.siat.ac.cn:8080/handle/172644/13472
Collection | Shenzhen Institute of Advanced Technology_Institute of Integration Technology
Recommended Citation (GB/T 7714) | LONGTAO CHEN, XIAOJIANG PENG, MINGWU REN. Recurrent Metric Networks and Batch Multiple Hypothesis for Multi-Object Tracking[J]. IEEE ACCESS, 2018.
APA | LONGTAO CHEN, XIAOJIANG PENG, & MINGWU REN. (2018). Recurrent Metric Networks and Batch Multiple Hypothesis for Multi-Object Tracking. IEEE ACCESS.
MLA | LONGTAO CHEN, et al. "Recurrent Metric Networks and Batch Multiple Hypothesis for Multi-Object Tracking". IEEE ACCESS (2018).
Deposit Method: OAI Harvesting
Source: Shenzhen Institute of Advanced Technology