Chinese Academy of Sciences Institutional Repositories Grid
Incremental Concept Learning via Online Generative Memory Recall

Document Type: Journal Article

Authors: Li, Huaiyu [1,2]; Dong, Weiming [1]; Hu, Bao-Gang [1]
Journal: IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS
Publication Date: 2021-07-01
Volume: 32; Issue: 7; Pages: 3206-3216
ISSN: 2162-237X
Keywords: Task analysis; Learning systems; Neural networks; Feature extraction; Visualization; Knowledge engineering; Training; Catastrophic forgetting; Continual learning; Generative adversarial networks (GANs)
DOI: 10.1109/TNNLS.2020.3010581
Corresponding Author: Dong, Weiming (weiming.dong@ia.ac.cn)
Abstract: The ability to learn more concepts from incrementally arriving data over time is essential for the development of a lifelong learning system. However, deep neural networks often suffer from forgetting previously learned concepts when continually learning new concepts, which is known as the catastrophic forgetting problem. The main reasons for catastrophic forgetting are that past concept data are not available and that neural weights change while new concepts are learned incrementally. In this article, we propose an incremental concept learning framework that includes two components, namely, ICLNet and RecallNet. ICLNet, which consists of a trainable feature extractor and a dynamic concept memory matrix, aims to learn new concepts incrementally. We propose a concept-contrastive loss to limit the magnitude of neural weight changes and mitigate the catastrophic forgetting problem. RecallNet aims to consolidate old concept memory and recall pseudo samples while ICLNet learns new concepts. We propose a balanced online memory recall strategy to reduce the information loss of old concept memory. We evaluate the proposed approach on the MNIST, Fashion-MNIST, and SVHN data sets and compare it with other pseudorehearsal-based approaches. Extensive experiments demonstrate the effectiveness of our approach.
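To illustrate the pseudo-rehearsal scheme described in the abstract, the sketch below shows one possible training step in PyTorch: pseudo samples of old concepts are recalled from a conditional generator and mixed with real samples of new concepts, and a prototype-style contrastive loss ties embeddings to a concept memory matrix. This is a minimal sketch under assumptions; the names (icl_net, recall_net, concept_memory) and the exact form of the loss are illustrative, not the authors' implementation.

import torch
import torch.nn.functional as F

def concept_contrastive_loss(embeddings, labels, concept_memory, tau=0.1):
    # Cosine similarity between each embedding and every concept prototype;
    # cross-entropy pulls a sample toward its own prototype and away from the rest.
    # (A generic prototype-contrastive form; the paper's loss may differ.)
    sims = F.normalize(embeddings, dim=1) @ F.normalize(concept_memory, dim=1).T
    return F.cross_entropy(sims / tau, labels)

def incremental_train_step(icl_net, recall_net, concept_memory, optimizer,
                           new_x, new_y, n_old_concepts, replay_batch=32):
    # Recall pseudo samples of previously learned concepts from the generator.
    # Hypothetical conditional-generator interface: recall_net(labels) -> images.
    if n_old_concepts > 0:
        with torch.no_grad():
            old_y = torch.randint(0, n_old_concepts, (replay_batch,))
            old_x = recall_net(old_y)
        x = torch.cat([new_x, old_x], dim=0)
        y = torch.cat([new_y, old_y], dim=0)
    else:
        x, y = new_x, new_y

    z = icl_net(x)  # trainable feature extractor -> embeddings
    loss = concept_contrastive_loss(z, y, concept_memory)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

In such a setup the generator is typically frozen (or trained separately) while replaying, so the replayed batch approximates old concept data without storing any past samples.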
Funding Projects: National Key Research and Development Program of China [2018AAA0101005]; National Natural Science Foundation of China [61832016]; National Natural Science Foundation of China [61672520]; National Natural Science Foundation of China [61720106006]
WOS Research Areas: Computer Science; Engineering
Language: English
Publisher: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
WOS Accession Number: WOS:000670541500033
Funding Organizations: National Key Research and Development Program of China; National Natural Science Foundation of China
Source URL: http://ir.ia.ac.cn/handle/173211/45280
Collection: Institute of Automation, National Laboratory of Pattern Recognition, Multimedia Computing and Graphics Team
Author Affiliations:
1. Chinese Acad Sci, Inst Automat, Natl Lab Pattern Recognit, Beijing 100190, Peoples R China
2. Univ Chinese Acad Sci, Sch Artificial Intelligence, Beijing 100049, Peoples R China
Recommended Citation:
GB/T 7714: Li, Huaiyu, Dong, Weiming, Hu, Bao-Gang. Incremental Concept Learning via Online Generative Memory Recall[J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2021, 32(7): 3206-3216.
APA: Li, Huaiyu, Dong, Weiming, & Hu, Bao-Gang. (2021). Incremental Concept Learning via Online Generative Memory Recall. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 32(7), 3206-3216.
MLA: Li, Huaiyu, et al. "Incremental Concept Learning via Online Generative Memory Recall." IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS 32.7 (2021): 3206-3216.

Ingestion Method: OAI Harvesting

Source: Institute of Automation

