Chinese Academy of Sciences Institutional Repositories Grid
Dynamical Channel Pruning by Conditional Accuracy Change for Deep Neural Networks

Document Type: Journal Article

Authors: Chen, Zhiqiang1,3; Xu, Ting-Bing1,4; Du, Changde1,3; Liu, Cheng-Lin1,2,4; He, Huiguang1,2,3,4
Journal: IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS
Publication Date: 2020-04
Volume: Issue: Pages:
Keywords: Conditional accuracy change (CAC); direct criterion; dynamical channel pruning; neural network compression; structure shaping
ISSN: 1045-9227
DOI: 10.1109/TNNLS.2020.2979517
Corresponding Author: He, Huiguang (huiguang.he@ia.ac.cn)
Document Subtype: Full-Length Article
Abstract

Channel pruning is an effective technique that has been widely applied to deep neural network compression. However, many existing methods prune from a pretrained model, resulting in repetitious pruning and fine-tuning processes. In this article, we propose a dynamical channel pruning method, which prunes unimportant channels at an early stage of training. Rather than utilizing indirect criteria (e.g., weight norm, absolute weight sum, and reconstruction error) to guide connection or channel pruning, we design criteria directly related to the final accuracy of a network to evaluate the importance of each channel. Specifically, a channelwise gate is designed to randomly enable or disable each channel so that the conditional accuracy change (CAC) can be estimated under the condition that a given channel is disabled. Practically, we construct two effective and efficient criteria to dynamically estimate the CAC at each iteration of training; thus, unimportant channels can be gradually pruned during the training process. Finally, extensive experiments on multiple data sets (i.e., ImageNet, CIFAR, and MNIST) with various networks (i.e., ResNet, VGG, and MLP) demonstrate that the proposed method effectively reduces the parameters and computations of the baseline network while yielding higher or competitive accuracy. Interestingly, if we Double the initial Channels and then Prune Half (DCPH) of them down to the baseline's counterpart, the network enjoys a remarkable performance improvement by shaping a more desirable structure.
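To make the gating mechanism concrete, below is a minimal PyTorch sketch of a channelwise stochastic gate. This is an illustration only: the module name, the keep probability, and the mask bookkeeping are assumptions, and the paper's two CAC estimation criteria are not reproduced here.

import torch
import torch.nn as nn

class StochasticChannelGate(nn.Module):
    # Hypothetical illustration of a channelwise gate: each channel of a
    # feature map is independently kept or zeroed at every training step,
    # so losses can later be compared conditioned on a channel being off.
    def __init__(self, num_channels, keep_prob=0.9):
        super().__init__()
        self.num_channels = num_channels
        self.keep_prob = keep_prob
        self.last_mask = None

    def forward(self, x):
        if not self.training:
            return x  # gates are only active during training
        # Bernoulli mask over channels: 1 = enabled, 0 = disabled.
        mask = torch.bernoulli(
            torch.full((1, self.num_channels, 1, 1),
                       self.keep_prob, device=x.device))
        # Recorded so (mask, loss) pairs can feed a CAC-style estimate.
        self.last_mask = mask
        return x * mask

In such a setup, one would insert a gate after each convolutional layer, record the sampled mask together with the resulting loss at every iteration, and estimate for each channel how much the accuracy changes when that channel is disabled; channels whose estimated CAC is negligible can then be pruned as training proceeds.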

Funding Projects: National Natural Science Foundation of China [61976209]; National Natural Science Foundation of China [61721004]; CAS International Collaboration Key Project; Strategic Priority Research Program of CAS [XDB32040200]
WOS Research Areas: Computer Science; Engineering
Language: English
WOS Accession Number: WOS:000616310400027
Publisher: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
Funding Organizations: National Natural Science Foundation of China; CAS International Collaboration Key Project; Strategic Priority Research Program of CAS
Source URL: http://ir.ia.ac.cn/handle/173211/42215
Collection: Research Center for Brain-Inspired Intelligence, Neural Computation and Brain-Computer Interaction
Author Affiliations:
1.University of Chinese Academy of Sciences (UCAS), Beijing 100049, China
2.Center for Excellence in Brain Science and Intelligence Technology, Chinese Academy of Sciences, Beijing 100190, China
3.Research Center for Brain-Inspired Intelligence, Institute of Automation, Chinese Academy of Sciences (CASIA), Beijing 100190, China
4.National Laboratory of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences (CASIA), Beijing 100190, China
Recommended Citation
GB/T 7714
Chen, Zhiqiang, Xu, Ting-Bing, Du, Changde, et al. Dynamical Channel Pruning by Conditional Accuracy Change for Deep Neural Networks[J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2020.
APA: Chen, Zhiqiang, Xu, Ting-Bing, Du, Changde, Liu, Cheng-Lin, & He, Huiguang. (2020). Dynamical Channel Pruning by Conditional Accuracy Change for Deep Neural Networks. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS.
MLA: Chen, Zhiqiang, et al. "Dynamical Channel Pruning by Conditional Accuracy Change for Deep Neural Networks." IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS (2020).

Deposit Method: OAI Harvesting

Source: Institute of Automation


Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.