Chinese Academy of Sciences Institutional Repositories Grid
Large Batch Optimization for Object Detection: Training COCO in 12 Minutes

Document Type: Conference Paper

Authors: Wang, Tong2,5; Zhu, Yousong2,4; Zhao, Chaoyang2; Zeng, Wei1,6; Wang, Yaowei6; Wang, Jinqiao2,3,5; Tang, Ming2
Publication Date: 2020-08
Conference Date: 2020-08-24
Conference Venue: Online
Abstract

Most existing object detectors adopt a small training batch size (e.g., 16), which severely hinders the community from exploring large-scale datasets due to the extremely long training procedure. In this paper, we propose a versatile large-batch optimization framework for object detection, named LargeDet, which successfully scales the batch size beyond 1K for the first time. Specifically, we present a novel Periodical Moments Decay LAMB (PMD-LAMB) algorithm to effectively reduce the negative effects of lagging historical gradients. Additionally, Synchronized Batch Normalization (SyncBN) is utilized to accelerate convergence. With LargeDet, we not only substantially shorten the training period but also significantly improve the detection accuracy on sparsely annotated large-scale datasets. For instance, we finish training a ResNet50 FPN detector on COCO within 12 minutes. Moreover, we achieve a 12.2% absolute mAP@0.5 improvement for ResNet50 FPN on Open Images by training with batch size 640.
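
The abstract describes PMD-LAMB only at a high level: the optimizer's running moments are periodically decayed so that stale gradient history carries less weight under very large batches. The sketch below illustrates that idea in PyTorch as a minimal, assumption-laden illustration, not the authors' implementation; torch.optim.Adam stands in for LAMB (which is not in core PyTorch), and the period and decay factor rho are hypothetical values.

import torch

def periodical_moments_decay(optimizer, step, period=1000, rho=0.5):
    """Every `period` steps, shrink the optimizer's running moments.

    Works with Adam-style optimizers (including LAMB implementations)
    that store `exp_avg` / `exp_avg_sq` in `optimizer.state`.
    Hypothetical sketch of the periodic-decay idea; `period` and `rho`
    are assumed values, not the paper's hyper-parameters.
    """
    if step == 0 or step % period != 0:
        return
    for group in optimizer.param_groups:
        for p in group["params"]:
            state = optimizer.state.get(p)
            if not state:
                continue
            # Decay first and second moments so that lagging historical
            # gradients contribute less to subsequent updates.
            if "exp_avg" in state:
                state["exp_avg"].mul_(rho)
            if "exp_avg_sq" in state:
                state["exp_avg_sq"].mul_(rho)

# Usage inside a toy training loop (Adam stands in for LAMB here):
model = torch.nn.Linear(8, 2)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(1, 5001):
    opt.zero_grad()
    loss = model(torch.randn(640, 8)).pow(2).mean()
    loss.backward()
    opt.step()
    periodical_moments_decay(opt, step)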

Source URL: http://ir.ia.ac.cn/handle/173211/47416
Collection: Institute of Automation / National Laboratory of Pattern Recognition / Image and Video Analysis Team
Affiliations:
1. Peking University, Beijing, China
2. National Laboratory of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences, Beijing, China
3. NEXWISE Co., Ltd, Guangzhou, China
4. ObjectEye Inc., Beijing, China
5. School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing, China
6. Peng Cheng Laboratory, Shenzhen, China
Recommended Citation (GB/T 7714):
Wang, Tong, Zhu, Yousong, Zhao, Chaoyang, et al. Large Batch Optimization for Object Detection: Training COCO in 12 Minutes[C]. In: . Online. 2020-08-24.

Deposit Method: OAI Harvesting

Source: Institute of Automation


Unless otherwise noted, all content in this system is protected by copyright, with all rights reserved.