Chinese Academy of Sciences Institutional Repositories Grid
Face Model Compression by Distilling Knowledge from Neurons

Document Type: Conference Paper

Authors: Ping Luo; Zhenyao Zhu; Ziwei Liu; Xiaogang Wang; Xiaoou Tang
Publication Date: 2016
Conference: AAAI 2016
Venue: USA
Abstract: Recent advanced face recognition systems are built on large Deep Neural Networks (DNNs) or their ensembles, which have millions of parameters. However, the expensive computation of DNNs makes their deployment difficult on mobile and embedded devices. This work addresses model compression for face recognition, where the learned knowledge of a large teacher network or its ensemble is utilized as supervision to train a compact student network. Unlike previous works that represent the knowledge by the softened label probabilities, which are difficult to fit, we represent the knowledge by using the neurons at the higher hidden layer, which preserve as much information as the label probabilities but are more compact. By leveraging the essential characteristics (domain knowledge) of the learned face representation, a neuron selection method is proposed to choose neurons that are most relevant to face recognition. Using the selected neurons as supervision to mimic the single networks of DeepID2+ and DeepID3, which are state-of-the-art face recognition systems, a compact student with a simple network structure achieves better verification accuracy on LFW than each of its teachers. When using an ensemble of DeepID2+ as the teacher, a mimicked student is able to outperform it while achieving a 51.6× compression ratio and a 90× speed-up in inference, making this cumbersome model applicable on portable devices.
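The core idea in the abstract — supervising the student with selected top-hidden-layer neurons of the teacher rather than with softened label probabilities — can be sketched as a regression loss over a neuron subset. The sketch below is illustrative only: the variance-based `select_neurons` criterion is a hypothetical stand-in (the paper proposes a face-specific neuron selection method), and all function names are assumptions, not the authors' code.

```python
import numpy as np

def select_neurons(teacher_feats, k):
    """Toy stand-in for neuron selection: keep the k neurons with the
    highest activation variance across the batch. (Assumption for
    illustration; the paper uses a face-recognition-specific criterion.)"""
    variances = teacher_feats.var(axis=0)
    return np.argsort(variances)[-k:]

def mimic_loss(student_feats, teacher_feats, selected):
    """Mean squared error between student and teacher activations,
    restricted to the selected neurons."""
    diff = student_feats[:, selected] - teacher_feats[:, selected]
    return float(np.mean(diff ** 2))

# Usage with synthetic features: a batch of 8 faces, 512-d teacher layer.
rng = np.random.default_rng(0)
teacher = rng.standard_normal((8, 512))
student = teacher + 0.1 * rng.standard_normal((8, 512))  # a near-perfect student
idx = select_neurons(teacher, k=128)
loss = mimic_loss(student, teacher, idx)
```

In training, `mimic_loss` would be minimized with respect to the student network's parameters; because hidden activations are denser targets than class probabilities, they can be easier to fit with a much smaller network.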
Indexed By: EI
Language: English
Source URL: [http://ir.siat.ac.cn:8080/handle/172644/10013]
Collection: Shenzhen Institute of Advanced Technology, Institute of Integration
Author Affiliation: 2016
Recommended Citation (GB/T 7714):
Ping Luo, Zhenyao Zhu, Ziwei Liu, et al. Face Model Compression by Distilling Knowledge from Neurons[C]. In: AAAI 2016. USA.

Deposit Method: OAI harvesting

Source: Shenzhen Institute of Advanced Technology


Unless otherwise stated, all content in this system is protected by copyright, and all rights are reserved.