An Efficient Knowledge Transfer Strategy for Spiking Neural Networks from Static to Event Domain
Document Type: Conference Paper
Authors | He, Xiang (2,4); Zhao, Dongcheng (4) |
Publication Date | 2024 |
Conference Date | 2024-02-20 |
Conference Venue | Vancouver, Canada |
DOI | https://doi.org/10.1609/aaai.v38i1.27806 |
Abstract | Spiking neural networks (SNNs) are rich in spatio-temporal dynamics and are suitable for processing event-based neuromorphic data. However, event-based datasets are usually less annotated than static datasets. This small data scale makes SNNs prone to overfitting and limits their performance. In order to improve the generalization ability of SNNs on event-based datasets, we use static images to assist SNN training on event data. In this paper, we first discuss the domain mismatch problem encountered when directly transferring networks trained on static datasets to event data. We argue that the inconsistency of feature distributions becomes a major factor hindering the effective transfer of knowledge from static images to event data. To address this problem, we propose solutions in terms of two aspects: feature distribution and training strategy. Firstly, we propose a knowledge transfer loss, which consists of domain alignment loss and spatio-temporal regularization. The domain alignment loss learns domain-invariant spatial features by reducing the marginal distribution distance between the static image and the event data. Spatio-temporal regularization provides dynamically learnable coefficients for domain alignment loss by using the output features of the event data at each time step as a regularization term. In addition, we propose a sliding training strategy, which gradually replaces static image inputs probabilistically with event data, resulting in a smoother and more stable training for the network. We validate our method on neuromorphic datasets, including N-Caltech101, CEP-DVS, and N-Omniglot. The experimental results show that our proposed method achieves better performance on all datasets compared to the current state-of-the-art methods. Code is available at https://github.com/Brain-Cog-Lab/Transfer-for-DVS. |
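The abstract describes two mechanisms: a sliding training strategy that probabilistically swaps static image inputs for event-data inputs as training progresses, and a domain alignment loss that reduces the distance between the two feature distributions. A minimal sketch of both ideas follows; the linear schedule and the mean-distance loss are simplifying assumptions for illustration only, and the paper's actual implementation is in the linked GitHub repository.

```python
import random

def sliding_training_schedule(epoch: int, total_epochs: int) -> float:
    """Probability of feeding event data instead of a static image.

    A hypothetical linear ramp: early epochs mostly use static images,
    later epochs mostly use event data. The paper's exact schedule may
    differ.
    """
    return min(1.0, epoch / max(1, total_epochs - 1))

def pick_input(static_sample, event_sample, epoch, total_epochs, rng=random):
    """Probabilistically replace the static input with event data."""
    p_event = sliding_training_schedule(epoch, total_epochs)
    return event_sample if rng.random() < p_event else static_sample

def domain_alignment_loss(static_feats, event_feats):
    """Toy stand-in for the domain alignment loss: squared distance
    between the mean features of the two domains (the paper uses a
    marginal distribution distance on network features)."""
    mean_s = sum(static_feats) / len(static_feats)
    mean_e = sum(event_feats) / len(event_feats)
    return (mean_s - mean_e) ** 2
```

At epoch 0 the schedule returns 0.0 (all static inputs) and at the final epoch it returns 1.0 (all event inputs), so the network transitions smoothly between domains rather than switching abruptly.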
Language | English |
Source URL | http://ir.ia.ac.cn/handle/173211/57241 |
Collection | Research Center for Brain-inspired Intelligence, Brain-inspired Cognitive Computing |
Corresponding Authors | Kong, Qingqun; Zeng, Yi |
Affiliations | 1. School of Future Technology, University of Chinese Academy of Sciences, Beijing, China; 2. School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing, China; 3. Center for Excellence in Brain Science and Intelligence Technology, Chinese Academy of Sciences, Shanghai, China; 4. Brain-inspired Cognitive Intelligence Lab, Institute of Automation, Chinese Academy of Sciences, Beijing, China |
Recommended Citation (GB/T 7714) | He, Xiang, Zhao, Dongcheng, Li, Yang, et al. An Efficient Knowledge Transfer Strategy for Spiking Neural Networks from Static to Event Domain[C]. Vancouver, Canada, 2024-02-20.
Deposit Method: OAI harvest
Source: Institute of Automation
Unless otherwise noted, all content in this system is protected by copyright, and all rights are reserved.