Learning Lightweight Dynamic Kernels With Attention Inside via Local-Global Context Fusion
Document Type | Journal Article
Authors | Tian, Yonglin (4); Shen, Yu; Wang, Xiao; Wang, Jiangong; Wang, Kunfeng; ...; Wang, Fei-Yue
Journal | IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS
Publication Date | 2022-11-14
Pages | 15
Keywords | Attention inside kernels; dynamic convolution; global context; local context
ISSN | 2162-237X
DOI | 10.1109/TNNLS.2022.3217301
Corresponding Author | Wang, Fei-Yue (feiyue.wang@ia.ac.cn)
Abstract | Traditional convolutional neural networks (CNNs) share their kernels across all positions of the input, which may constrain their representation ability in feature extraction. Dynamic convolution generates different kernels for different inputs to improve model capacity; however, the total number of parameters of such dynamic networks can be prohibitively large. In this article, we propose a lightweight dynamic convolution method that strengthens traditional CNNs with an affordable increase in total parameters and multiply-adds. Instead of generating whole kernels directly or combining several static kernels, we choose to "look inside", learning the attention within convolutional kernels. An extra network adjusts the kernel weights for every feature aggregation operation. By combining local and global contexts, the proposed approach captures the variance among different samples, across different positions of the feature maps, and across different positions inside sliding windows. With a minor increase in the number of model parameters, remarkable improvements in image classification on CIFAR and ImageNet with multiple backbones are obtained. Experiments on object detection also verify the effectiveness of the proposed method.
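This record does not include the paper's implementation. As a rough, hypothetical illustration of the idea sketched in the abstract (attention learned inside a shared kernel, predicted per sliding window from fused local and global context), the minimal PyTorch module below shows one way such a layer could be wired. The class name KernelAttentionConv2d, the sigmoid gating, the additive local-global fusion, and the bottleneck ratio are all assumptions, not the authors' design.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class KernelAttentionConv2d(nn.Module):
    """Hypothetical sketch: a shared static kernel whose k*k entries are
    reweighted, per sliding window, by attention predicted from fused
    local (per-position) and global (per-sample) context."""

    def __init__(self, in_ch, out_ch, k=3, padding=1, reduction=4):
        super().__init__()
        self.k, self.padding = k, padding
        # Shared static kernel, as in an ordinary convolution (no bias).
        self.weight = nn.Parameter(torch.randn(out_ch, in_ch, k, k) * 0.02)
        # Local branch: k*k attention logits for each spatial position.
        self.local = nn.Conv2d(in_ch, k * k, kernel_size=k, padding=padding)
        # Global branch: global average pooling + bottleneck MLP.
        mid = max(in_ch // reduction, 1)
        self.global_fc = nn.Sequential(
            nn.Linear(in_ch, mid), nn.ReLU(inplace=True), nn.Linear(mid, k * k))

    def forward(self, x):
        b, c, h, w = x.shape
        # Fuse local and global context, then gate the k*k window positions.
        local = self.local(x)                                   # (B, k*k, H, W)
        glob = self.global_fc(x.mean(dim=(2, 3))).view(b, -1, 1, 1)
        attn = torch.sigmoid(local + glob).view(b, 1, self.k * self.k, h * w)
        # Unfold the input into sliding windows and reweight every entry.
        patches = F.unfold(x, self.k, padding=self.padding)     # (B, C*k*k, H*W)
        patches = patches.view(b, c, self.k * self.k, h * w) * attn
        # Aggregate with the shared kernel (a batched matrix multiply).
        wmat = self.weight.view(self.weight.size(0), -1)        # (O, C*k*k)
        out = wmat @ patches.reshape(b, c * self.k * self.k, h * w)
        return out.view(b, -1, h, w)                            # (B, O, H, W)


# Usage with assumed shapes: a drop-in replacement for a 3x3 convolution.
layer = KernelAttentionConv2d(in_ch=16, out_ch=32)
y = layer(torch.randn(2, 16, 8, 8))   # -> torch.Size([2, 32, 8, 8])
```

Note that such a layer only predicts k*k attention values per position rather than an entire kernel, which keeps the parameter and multiply-add overhead small, consistent with the "lightweight" claim of the abstract.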
Funding Projects | Key-Area Research and Development Program of Guangdong Province [2020B090921003]; Key Research and Development Program of Guangzhou [202007050002]; Natural Science Key Foundation of Jiangsu Education Department [21KJA510004]; Intel Collaborative Research Institute for Intelligent and Automated Connected Vehicles (ICRI-IACV); National Natural Science Foundation of China [62076020]; National Natural Science Foundation of China [U1811463]; National Natural Science Foundation of China [61976120]; National Natural Science Foundation of China [62173329]
WOS Research Areas | Computer Science; Engineering
Language | English
WOS Accession Number | WOS:000886698500001
Publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
Funding Organizations | Key-Area Research and Development Program of Guangdong Province; Key Research and Development Program of Guangzhou; Natural Science Key Foundation of Jiangsu Education Department; Intel Collaborative Research Institute for Intelligent and Automated Connected Vehicles (ICRI-IACV); National Natural Science Foundation of China
Source URL | http://ir.ia.ac.cn/handle/173211/51278
Collection | Institute of Automation_State Key Laboratory of Management and Control for Complex Systems_Advanced Control and Automation Team
Author Affiliations | 1. Univ Science & Technol China, Natl Engn Lab Brain inspired Intelligence Technol, Hefei 230027, Peoples R China; 2. Nantong Univ, Sch Informat Sci & Technol, Nantong 226019, Peoples R China; 3. Beijing Univ Chem Technol, Coll Informat Sci & Technol, Beijing 100029, Peoples R China; 4. Chinese Acad Sci, Inst Automat, State Key Lab Management & Control Complex Syst, Beijing 100190, Peoples R China
Recommended Citation (GB/T 7714) | Tian, Yonglin, Shen, Yu, Wang, Xiao, et al. Learning Lightweight Dynamic Kernels With Attention Inside via Local-Global Context Fusion[J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022: 15.
APA | Tian, Yonglin, Shen, Yu, Wang, Xiao, Wang, Jiangong, Wang, Kunfeng, ... & Wang, Fei-Yue. (2022). Learning Lightweight Dynamic Kernels With Attention Inside via Local-Global Context Fusion. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 15.
MLA | Tian, Yonglin, et al. "Learning Lightweight Dynamic Kernels With Attention Inside via Local-Global Context Fusion". IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS (2022): 15.
Deposit Method: OAI Harvesting
Source: Institute of Automation