PDNet: Toward Better One-Stage Object Detection With Prediction Decoupling
Document Type | Journal Article
Authors | Yang, Li (4,5); Xu, Yan; Wang, Shaoru; Yuan, Chunfeng; Zhang, Ziqi; ...; Hu, Weiming
Journal | IEEE TRANSACTIONS ON IMAGE PROCESSING
Publication Date | 2022
Volume | 31
Pages | 5121-5133
Keywords | Object detection; prediction decoupling; convolutional neural network
ISSN | 1057-7149
DOI | 10.1109/TIP.2022.3193223 |
Corresponding Author | Yuan, Chunfeng (cfyuan@nlpr.ia.ac.cn)
Abstract | Recent one-stage object detectors follow a per-pixel prediction approach that predicts both the object category scores and boundary positions from every single grid location. However, the most suitable positions for inferring different targets, i.e., the object category and boundaries, are generally different. Predicting all these targets from the same grid location thus may lead to sub-optimal results. In this paper, we analyze the suitable inference positions for object category and boundaries, and propose a prediction-target-decoupled detector named PDNet to establish a more flexible detection paradigm. Our PDNet with the prediction decoupling mechanism encodes different targets separately in different locations. A learnable prediction collection module is devised with two sets of dynamic points, i.e., dynamic boundary points and semantic points, to collect and aggregate the predictions from the favorable regions for localization and classification. We adopt a two-step strategy to learn these dynamic point positions, where the prior positions are estimated for different targets first, and the network further predicts residual offsets to the positions with better perceptions of the object properties. Extensive experiments on the MS COCO benchmark demonstrate the effectiveness and efficiency of our method. With a single ResNeXt-64x4d-101-DCN as the backbone, our detector achieves 50.1 AP with single-scale testing, which outperforms the state-of-the-art methods by an appreciable margin under the same experimental settings. Moreover, our detector is highly efficient as a one-stage framework. Our code is public at https://github.com/yangli18/PDNet.
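The abstract's core idea is that classification and localization predictions are not read off the same grid cell: per-location offsets define dynamic points, and predictions sampled at those points are aggregated. Below is a minimal, hypothetical PyTorch sketch of that prediction-collection idea, not the authors' released implementation (see the GitHub link in the abstract for that); names such as PredictionCollector and num_points are illustrative assumptions.

```python
# Hypothetical sketch of prediction collection via dynamic points (not the PDNet code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class PredictionCollector(nn.Module):
    """Gathers per-pixel predictions from a few learned dynamic point locations."""
    def __init__(self, channels: int, num_points: int = 4):
        super().__init__()
        self.num_points = num_points
        # Predict (dx, dy) offsets, in feature-map cells, for each dynamic point.
        self.offset_pred = nn.Conv2d(channels, 2 * num_points, kernel_size=3, padding=1)

    def forward(self, feat: torch.Tensor, per_pixel_pred: torch.Tensor) -> torch.Tensor:
        n, _, h, w = feat.shape
        offsets = self.offset_pred(feat).view(n, self.num_points, 2, h, w)

        # Base grid of pixel-center coordinates (x over width, y over height).
        ys, xs = torch.meshgrid(
            torch.arange(h, dtype=feat.dtype, device=feat.device),
            torch.arange(w, dtype=feat.dtype, device=feat.device),
            indexing="ij",
        )
        base = torch.stack((xs, ys), dim=0)                    # (2, H, W)

        collected = []
        for k in range(self.num_points):
            pts = base + offsets[:, k]                         # (N, 2, H, W), absolute coords
            # Normalize to [-1, 1] as required by grid_sample.
            norm_x = pts[:, 0] / max(w - 1, 1) * 2 - 1
            norm_y = pts[:, 1] / max(h - 1, 1) * 2 - 1
            grid = torch.stack((norm_x, norm_y), dim=-1)       # (N, H, W, 2)
            collected.append(F.grid_sample(per_pixel_pred, grid, align_corners=True))
        # Aggregate the collected predictions; a plain mean here, whereas the paper
        # learns where to sample for classification vs. localization separately.
        return torch.stack(collected, dim=0).mean(dim=0)

# Usage: collect 80-way classification scores with points driven by a 256-channel feature map.
feat = torch.randn(1, 256, 32, 32)
cls_pred = torch.randn(1, 80, 32, 32)
out = PredictionCollector(256, num_points=4)(feat, cls_pred)   # -> (1, 80, 32, 32)
```

In the paper's formulation, separate sets of dynamic boundary points and semantic points serve localization and classification respectively, and their positions come from prior estimates plus predicted residual offsets; the sketch above only illustrates the sampling-and-aggregation step.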
Funding Projects | National Key Research and Development Program of China [2020AAA0106800]; Beijing Natural Science Foundation [JQ21017]; Beijing Natural Science Foundation [4224091]; National Natural Science Foundation of China [61972397]; National Natural Science Foundation of China [62036011]; National Natural Science Foundation of China [62192782]; National Natural Science Foundation of China [61721004]; National Natural Science Foundation of China [61906192]; Key Research Program of Frontier Sciences, CAS [QYZDJ-SSW-JSC040]; China Postdoctoral Science Foundation [2021M693402]
WOS Research Areas | Computer Science; Engineering
Language | English
WOS Record Number | WOS:000835774000011
Publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
Funding Agencies | National Key Research and Development Program of China; Beijing Natural Science Foundation; National Natural Science Foundation of China; Key Research Program of Frontier Sciences, CAS; China Postdoctoral Science Foundation
Source URL | http://ir.ia.ac.cn/handle/173211/49821
Collection | Institute of Automation_National Laboratory of Pattern Recognition_Video Content Security Team
Author Affiliations | 1. CAS Ctr Excellence Brain Sci & Intelligence Techn, Shanghai 200031, Peoples R China; 2. PeopleAI Inc, Beijing 100190, Peoples R China; 3. Chinese Univ Hong Kong, Dept Elect Engn, Hong Kong, Peoples R China; 4. Univ Chinese Acad Sci, Sch Artificial Intelligence, Beijing 100190, Peoples R China; 5. Chinese Acad Sci, Natl Lab Pattern Recognit, Inst Automat, Beijing 100190, Peoples R China
Recommended Citation (GB/T 7714) | Yang, Li, Xu, Yan, Wang, Shaoru, et al. PDNet: Toward Better One-Stage Object Detection With Prediction Decoupling[J]. IEEE TRANSACTIONS ON IMAGE PROCESSING, 2022, 31: 5121-5133.
APA | Yang, Li., Xu, Yan., Wang, Shaoru., Yuan, Chunfeng., Zhang, Ziqi., ... & Hu, Weiming. (2022). PDNet: Toward Better One-Stage Object Detection With Prediction Decoupling. IEEE TRANSACTIONS ON IMAGE PROCESSING, 31, 5121-5133.
MLA | Yang, Li, et al. "PDNet: Toward Better One-Stage Object Detection With Prediction Decoupling." IEEE TRANSACTIONS ON IMAGE PROCESSING 31 (2022): 5121-5133.
Ingest Method: OAI Harvesting
Source: Institute of Automation