Chinese Academy of Sciences Institutional Repositories Grid
Relational Learning for Joint Head and Human Detection

Document Type: Conference Paper

Authors: Cheng Chi1,3; Shifeng Zhang2,3; Junliang Xing2,3; Zhen Lei2,3; Stan Z. Li2,3; Xudong Zou1,3
Publication Date: 2020
Conference Date: 2020-02
Conference Location: New York, USA
English Abstract

Head and human detection have been rapidly improved with the development of deep convolutional neural networks. However, these two tasks are often studied separately without considering their inherent correlation, which leads to two problems: 1) head detection often suffers from many false positives, and 2) the performance of human detectors frequently drops dramatically in crowded scenes. To address these two issues, we present JointDet, a novel joint head and human detection network that effectively detects heads and human bodies simultaneously. Moreover, we design a head-body relationship discriminating module to perform relational learning between heads and human bodies, and leverage this learned relationship to regain suppressed human detections and reduce head false positives. To verify the effectiveness of the proposed method, we annotate head bounding boxes for the CityPersons and Caltech-USA datasets and conduct extensive experiments on the CrowdHuman, CityPersons, and Caltech-USA datasets. As a result, the proposed JointDet detector achieves state-of-the-art performance on these three benchmarks. To facilitate further studies on the head and human detection problem, all new annotations, source code, and trained models will be made public.
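
The abstract only describes the approach at a high level; below is a minimal, hypothetical sketch of how a head-body relationship score could be used to regain human detections suppressed by NMS in crowded scenes. It is not the authors' implementation (their source code is to be released separately): the hand-crafted relation_score heuristic, the 0.5 threshold, and all function names are assumptions made for illustration, whereas JointDet learns this relationship with a dedicated network module.

```python
# Hypothetical illustration only, not the JointDet code: a head-body relation
# score is used to re-admit body boxes that NMS suppressed in a crowd, when a
# detected head supports them and no kept body already explains that head.

def relation_score(head_box, body_box):
    """Stand-in for the learned head-body relationship discriminating module.
    Scores 1.0 if the head centre falls in the upper third of the body box."""
    hx = (head_box[0] + head_box[2]) / 2.0
    hy = (head_box[1] + head_box[3]) / 2.0
    inside_x = body_box[0] <= hx <= body_box[2]
    upper_y = body_box[1] <= hy <= body_box[1] + (body_box[3] - body_box[1]) / 3.0
    return 1.0 if (inside_x and upper_y) else 0.0

def rescue_suppressed_bodies(kept_bodies, suppressed_bodies, heads, thresh=0.5):
    """Re-admit suppressed body boxes supported by an otherwise unexplained head."""
    rescued = []
    for body in suppressed_bodies:
        for head in heads:
            supports_body = relation_score(head, body) > thresh
            head_explained = any(relation_score(head, k) > thresh
                                 for k in kept_bodies + rescued)
            if supports_body and not head_explained:
                rescued.append(body)
                break
    return kept_bodies + rescued

if __name__ == "__main__":
    # Two overlapping pedestrians: NMS kept one body box and suppressed the
    # other, but both heads were detected, so the second body can be regained.
    kept = [[100, 50, 180, 300]]          # boxes are [x1, y1, x2, y2]
    suppressed = [[140, 55, 220, 310]]
    heads = [[125, 55, 155, 90], [170, 60, 200, 95]]
    print(rescue_suppressed_bodies(kept, suppressed, heads))
```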

Language: English
Source URL: [http://ir.ia.ac.cn/handle/173211/39044]
Collection: Institute of Automation_National Laboratory of Pattern Recognition_Center for Biometrics and Security Research
Author Affiliations:
1. Institute of Automation, Chinese Academy of Sciences
2. University of Chinese Academy of Sciences
3. Aerospace Information Research Institute, Chinese Academy of Sciences
Recommended Citation
GB/T 7714
Cheng Chi, Shifeng Zhang, Junliang Xing, et al. Relational Learning for Joint Head and Human Detection[C]. In: . New York, USA. 2020-02.

Deposit Method: OAI Harvesting

Source: Institute of Automation
