Improving One-Shot NAS with Shrinking-and-Expanding Supernet
Document type: Journal article
Author | Hu YM (胡一鸣) |
Journal | Pattern Recognition |
Publication date | 2021-05-24 |
Volume | 118; Issue: 0; Pages: 0 |
Keywords | Neural architecture search; Supernet; Search space shrinking |
Document subtype | SCI |
Abstract (English) | Training a supernet using a copy of shared weights has become a popular approach to speed up neural architecture search (NAS). However, it is difficult for a supernet to evaluate accurately on a large-scale search space due to the high weight coupling in the weight-sharing setting. To address this, we present a shrinking-and-expanding supernet that decouples the shared parameters by reducing the degree of weight sharing, avoiding the unstable and inaccurate performance estimation of previous methods. Specifically, we propose a new shrinking strategy that progressively simplifies the original search space by discarding unpromising operators in a smart way. Based on this, we further present an expanding strategy that appropriately increases the parameters of the shrunk supernet. We provide comprehensive evidence showing that, in a weight-sharing supernet, the proposed method SE-NAS brings more accurate and more stable performance estimation. Experimental results on the ImageNet dataset indicate that SE-NAS achieves higher Top-1 accuracy than its counterparts under the same complexity constraint and search space. An ablation study is presented to further understand SE-NAS. |
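The progressive shrinking described in the abstract can be illustrated with a minimal sketch. Everything here is hypothetical: the operator list `OPS`, the function `shrink_search_space`, and the `keep` parameter are placeholders, and the random proxy score stands in for the supernet-based operator evaluation that SE-NAS actually performs.

```python
import random

# Hypothetical candidate operator set for each searchable layer.
OPS = ["conv3x3", "conv5x5", "conv7x7", "skip", "maxpool"]

def shrink_search_space(num_layers=4, keep=2, seed=0):
    """Progressively discard the lowest-scoring operator per layer
    until each layer retains only `keep` candidates."""
    rng = random.Random(seed)
    space = [list(OPS) for _ in range(num_layers)]
    while max(len(ops) for ops in space) > keep:
        for ops in space:
            if len(ops) > keep:
                # Proxy score per operator; SE-NAS would instead rank
                # operators by supernet-evaluated performance.
                scores = {op: rng.random() for op in ops}
                ops.remove(min(scores, key=scores.get))
    return space

shrunk = shrink_search_space()
print([len(ops) for ops in shrunk])
```

Each call removes one operator per oversized layer per round, so the search space shrinks gradually rather than all at once, mirroring the "progressively simplifies" behavior the abstract describes.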
Language | English |
Source URL | http://ir.ia.ac.cn/handle/173211/44836 |
Collection | Research Center for Precision Sensing and Control / Precision Sensing and Control |
Affiliations | 1. School of Artificial Intelligence, University of Chinese Academy of Sciences; 2. Institute of Automation, Chinese Academy of Sciences |
Recommended citation (GB/T 7714) | Hu YM. Improving One-Shot NAS with Shrinking-and-Expanding Supernet[J]. Pattern Recognition, 2021, 118(0): 0. |
APA | Hu YM. (2021). Improving One-Shot NAS with Shrinking-and-Expanding Supernet. Pattern Recognition, 118(0), 0. |
MLA | Hu YM. "Improving One-Shot NAS with Shrinking-and-Expanding Supernet." Pattern Recognition 118.0 (2021): 0. |
Deposit method: OAI harvesting
Source: Institute of Automation
Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.