Task Decoupled Knowledge Distillation For Lightweight Face Detectors
Document Type | Conference paper
Author | Liang XQ (梁孝庆) |
Publication Date | 2020-10 |
Conference Date | October 2020 |
Conference Venue | Seattle, USA (virtual) |
Abstract | Face detection is a hot topic in computer vision. Face detection methods usually consist of two subtasks, i.e. the classification subtask and the regression subtask, which are trained with different samples. However, current face detection knowledge distillation methods usually couple the two subtasks and use the same set of samples in the distillation task. In this paper, we propose a task-decoupled knowledge distillation method, which decouples the detection distillation task into two subtasks and uses different samples in distilling the features of different subtasks. We first propose a feature decoupling method to decouple the classification features and the regression features without introducing any extra calculations at inference time. Specifically, we generate the corresponding features by adding task-specific convolutions in the teacher network and adding adaption convolutions on the feature maps of the student network. Then we select different samples for the different subtasks to imitate. Moreover, we also propose an effective probability distillation method to jointly boost the accuracy of the student network. We apply our distillation method to a lightweight face detector, EagleEye [37]. Experimental results show that the proposed method effectively improves the student detector's accuracy by 5.1%, 5.1%, and 2.8% AP on the Easy, Medium, and Hard subsets, respectively. |
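The core idea in the abstract — project the student's shared features through per-subtask "adaption" transforms, then make each projection imitate the corresponding teacher features only on the samples selected for that subtask — can be sketched roughly as below. This is a minimal illustration, not the paper's implementation: the function names, the linear stand-in for the adaption convolutions, and the example sample masks are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def adapt(student_feat, w):
    """Stand-in for an adaption convolution: a 1x1-conv-like
    channel projection, (N, C_s) @ (C_s, C_t) -> (N, C_t)."""
    return student_feat @ w

def masked_imitation_loss(student_feat, teacher_feat, w, mask):
    """Squared error between adapted student features and teacher
    features, averaged only over the samples chosen for this subtask."""
    adapted = adapt(student_feat, w)
    return ((adapted - teacher_feat) ** 2)[mask].mean()

# Toy shapes: N sampled anchors, student/teacher channel widths.
N, C_s, C_t = 8, 16, 32
student = rng.standard_normal((N, C_s))
teacher_cls = rng.standard_normal((N, C_t))    # teacher's classification-branch features
teacher_reg = rng.standard_normal((N, C_t))    # teacher's regression-branch features
w_cls = rng.standard_normal((C_s, C_t)) * 0.1  # one adaption transform per subtask
w_reg = rng.standard_normal((C_s, C_t)) * 0.1

# Different sample sets per subtask (hypothetical choice):
cls_mask = np.ones(N, dtype=bool)  # e.g. all sampled anchors for classification
reg_mask = np.arange(N) < 3        # e.g. positive anchors only for regression

distill_loss = (masked_imitation_loss(student, teacher_cls, w_cls, cls_mask)
                + masked_imitation_loss(student, teacher_reg, w_reg, reg_mask))
print(float(distill_loss))
```

Because the adaption transforms and the teacher's task-specific branches exist only during training, discarding them at inference leaves the student's original architecture unchanged, which is how the method avoids extra inference-time cost.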
Language | English |
Source URL | http://ir.ia.ac.cn/handle/173211/51515 |
Collection | Institute of Automation, National Laboratory of Pattern Recognition, Image and Video Analysis Group; Zidong Taichu Large Model Research Center |
Corresponding Author | Liang XQ (梁孝庆) |
Affiliation | Institute of Automation, Chinese Academy of Sciences |
Recommended Citation (GB/T 7714) | Liang XQ, Zhao X, Zhao CY, et al. Task Decoupled Knowledge Distillation For Lightweight Face Detectors[C]. In: . Seattle, USA (virtual). October 2020. |
Ingest Method: OAI harvesting
Source: Institute of Automation
Unless otherwise stated, all content in this system is protected by copyright, and all rights are reserved.