Chinese Academy of Sciences Institutional Repositories Grid
Advancing Spiking Neural Networks Toward Deep Residual Learning

Document Type: Journal Article

Authors: Hu, Yifan [1,2]; Deng, Lei [1]; Wu, Yujie [3]; Yao, Man [2,4]; Li, Guoqi [2,4]
Journal: IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS
Publication Date: 2024-02-08
Pages: 15
Keywords: Degradation problem; neuromorphic computing; residual neural network; spiking neural network (SNN)
ISSN: 2162-237X
DOI: 10.1109/TNNLS.2024.3355393
Corresponding Author: Li, Guoqi
Abstract: Despite the rapid progress of neuromorphic computing, the inadequate capacity and insufficient representation power of spiking neural networks (SNNs) severely restrict their application scope in practice. Residual learning and shortcuts have been evidenced as an important approach for training deep neural networks, but previous work has rarely assessed their applicability to the specifics of SNNs. In this article, we first identify that this negligence leads to impeded information flow and an accompanying degradation problem in a spiking version of vanilla ResNet. To address this issue, we propose a novel SNN-oriented residual architecture, termed MS-ResNet, which establishes membrane-based shortcut pathways, and we further prove that gradient norm equality can be achieved in MS-ResNet by introducing block dynamical isometry theory, which ensures that the network behaves well in a depth-insensitive way. We are thus able to significantly extend the depth of directly trained SNNs, e.g., up to 482 layers on CIFAR-10 and 104 layers on ImageNet, without observing any degradation problem. To validate the effectiveness of MS-ResNet, experiments are conducted on both frame-based and neuromorphic datasets. MS-ResNet104 achieves a superior result of 76.02% accuracy on ImageNet, which, to the best of our knowledge, is the highest in the domain of directly trained SNNs. Great energy efficiency is also observed, with an average of only one spike per neuron needed to classify an input sample. We believe our powerful and scalable models will provide strong support for further exploration of SNNs.
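The abstract's central design idea is that the shortcut connection carries continuous membrane-potential values, with the spiking activation placed before each convolution, so the residual addition never operates on binary spikes. The following is a minimal, single-time-step PyTorch sketch of that idea, not the authors' released implementation; the class names (SurrogateSpike, LIFNeuron, MSResidualBlock), the rectangular surrogate gradient, and the hyperparameters (tau=2.0, threshold=1.0) are illustrative assumptions, and a full MS-ResNet would additionally handle multiple time steps, strided/downsampling shortcuts, and deeper stacks.

import torch
import torch.nn as nn


class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass; rectangular surrogate gradient
    in the backward pass (an assumed, commonly used choice)."""

    @staticmethod
    def forward(ctx, v, threshold):
        ctx.save_for_backward(v)
        ctx.threshold = threshold
        return (v >= threshold).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        # Pass gradients only in a window around the firing threshold.
        window = (torch.abs(v - ctx.threshold) < 0.5).float()
        return grad_output * window, None


class LIFNeuron(nn.Module):
    """Leaky integrate-and-fire unit; keeps its membrane potential across
    forward calls (one call = one simulation time step)."""

    def __init__(self, tau=2.0, threshold=1.0):
        super().__init__()
        self.tau = tau
        self.threshold = threshold
        self.v = None

    def reset(self):
        self.v = None

    def forward(self, x):
        if self.v is None:
            self.v = torch.zeros_like(x)
        # Leaky integration toward the input current.
        self.v = self.v + (x - self.v) / self.tau
        spikes = SurrogateSpike.apply(self.v, self.threshold)
        self.v = self.v * (1.0 - spikes)  # hard reset where a spike fired
        return spikes


class MSResidualBlock(nn.Module):
    """Residual block with a membrane-based shortcut: spiking activations
    sit before the convolutions (pre-activation style), so the skip path
    and the addition carry continuous membrane-like values, not spikes."""

    def __init__(self, channels):
        super().__init__()
        self.lif1 = LIFNeuron()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.lif2 = LIFNeuron()
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x):
        out = self.bn1(self.conv1(self.lif1(x)))
        out = self.bn2(self.conv2(self.lif2(out)))
        return out + x  # the shortcut adds continuous values


if __name__ == "__main__":
    block = MSResidualBlock(channels=16)
    x = torch.randn(2, 16, 8, 8)  # (batch, channels, height, width), one step
    print(block(x).shape)  # torch.Size([2, 16, 8, 8])

Adding on membrane-like values (out + x) rather than on spike trains is the choice the abstract credits with preserving information flow and enabling very deep directly trained SNNs.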
WOS Keywords: INTELLIGENCE; MODEL
Funding Project: National Science Foundation for Distinguished Young Scholars
WOS Research Areas: Computer Science; Engineering
Language: English
WOS Record Number: WOS:001174240500001
Publisher: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
Funding Agency: National Science Foundation for Distinguished Young Scholars
Source URL: http://ir.ia.ac.cn/handle/173211/57801
Collection: Research Center for Digital Content Technology and Services - Auditory Model and Cognitive Computing
Author Affiliations:
1. Tsinghua Univ, Ctr Brain Inspired Comp Res, Dept Precis Instrument, Beijing 100084, Peoples R China
2. Peng Cheng Lab, Shenzhen 518066, Peoples R China
3. Graz Univ Technol, Inst Theoret Comp Sci, A-8010 Graz, Austria
4. Chinese Acad Sci, Inst Automat, Beijing 100190, Peoples R China
Recommended Citation:
GB/T 7714: Hu, Yifan, Deng, Lei, Wu, Yujie, et al. Advancing Spiking Neural Networks Toward Deep Residual Learning[J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024: 15.
APA: Hu, Yifan, Deng, Lei, Wu, Yujie, Yao, Man, & Li, Guoqi. (2024). Advancing Spiking Neural Networks Toward Deep Residual Learning. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 15.
MLA: Hu, Yifan, et al. "Advancing Spiking Neural Networks Toward Deep Residual Learning". IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS (2024): 15.

Deposit Method: OAI harvesting

Source: Institute of Automation


Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.