Revisiting Parameter Sharing for Automatic Neural Channel Number Search
Document Type: Conference Paper
Author | Wang JX (Wang Jiaxing) 4,6
Publication Date | 2020
Conference Date | 2020.12.06-2020.12.12
Conference Location | Online
Keywords | Neural Architecture Search; Model Compression; Parameter Sharing
Abstract | Recent advances in neural architecture search inspire many channel number search (CNS) algorithms for convolutional neural networks. To improve searching efficiency, parameter sharing is widely applied, which reuses parameters among different channel configurations. Nevertheless, it is unclear how parameter sharing affects the searching process. In this paper, we aim at providing a better understanding and exploitation of parameter sharing for CNS. Specifically, we propose affine parameter sharing (APS) as a general formulation to unify and quantitatively analyze existing channel search algorithms. It is found that with parameter sharing, weight updates of one architecture can simultaneously benefit other candidates. However, it also results in less confidence in choosing good architectures. We thus propose a new strategy of parameter sharing towards a better balance between training efficiency and architecture discrimination. Extensive analysis and experiments demonstrate the superiority of the proposed strategy in channel configuration search against many state-of-the-art counterparts on benchmark datasets.
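The abstract's central idea, deriving each candidate channel configuration's weights from one shared parameter tensor through learnable affine projections, can be illustrated with a short sketch. The following is a minimal PyTorch illustration under assumed names and shapes (`AffineSharedConv`, identity-initialized projection matrices); it is not the authors' released implementation:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AffineSharedConv(nn.Module):
    """Minimal sketch of affine parameter sharing (APS) for channel search.

    Every candidate width reuses one shared kernel through learnable
    affine projections, so a gradient step taken for one sampled
    configuration also moves the weights seen by all other candidates.
    Class name, shapes, and initialization here are illustrative assumptions.
    """

    def __init__(self, max_in, max_out, k, widths):
        super().__init__()
        self.k = k
        # Shared "meta" kernel at the largest channel configuration.
        self.weight = nn.Parameter(torch.randn(max_out, max_in, k, k) * 0.01)
        # One projection pair per candidate width; identity slices at init
        # reduce to plain sub-kernel sharing until training adapts them.
        self.p_out = nn.ParameterDict(
            {str(w): nn.Parameter(torch.eye(max_out)[:w].clone()) for w in widths})
        self.p_in = nn.ParameterDict(
            {str(w): nn.Parameter(torch.eye(max_in)[:w].clone()) for w in widths})

    def forward(self, x, w_in, w_out):
        # Project the shared kernel down to the sampled configuration:
        # W' = P_out @ W @ P_in^T, applied over the two channel axes.
        w = torch.einsum('oi,ijhw->ojhw', self.p_out[str(w_out)], self.weight)
        w = torch.einsum('cj,ojhw->ochw', self.p_in[str(w_in)], w)
        return F.conv2d(x, w, padding=self.k // 2)

# Sampling a candidate configuration during search: updates for any
# sampled width flow back into the same shared kernel `layer.weight`.
layer = AffineSharedConv(max_in=64, max_out=64, k=3, widths=[16, 32, 64])
x = torch.randn(2, 32, 8, 8)       # batch with 32 input channels
y = layer(x, w_in=32, w_out=64)    # one sampled channel configuration
```

In this reading, identity-initialized projections correspond to plain sub-tensor sharing, while letting the projections train increases coupling between candidates: this is one way to picture the trade-off the abstract describes between training efficiency and architecture discrimination.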
Proceedings Publisher | Curran Associates, Inc.
Language | English
Source URL | http://ir.ia.ac.cn/handle/173211/44748
Collection | Research on Brain-Inspired Chips and Systems
Author Affiliations | 1. University of Texas at Arlington; 2. Northeastern University; 3. Tencent AI Lab; 4. School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing 100190, China; 5. The Chinese University of Hong Kong; 6. Institute of Automation, Chinese Academy of Sciences, Beijing 100190, China
Recommended Citation (GB/T 7714) | Wang JX, Bai HL, Wu JX, et al. Revisiting Parameter Sharing for Automatic Neural Channel Number Search[C]. Online, 2020.12.06-2020.12.12.
Deposit Method: OAI Harvesting
Source: Institute of Automation