Contrastive Correlation Preserving Replay for Online Continual Learning
Document Type | Journal Article
Authors | Yu, Da (4); Zhang, Mingyi (2,3); Li, Mantian (4); Zha, Fusheng (4); Zhang, Junge (2,3); Sun, Lining (4); Huang, Kaiqi (1,2,3) |
Journal | IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY |
Publication Date | 2024 |
Volume/Issue/Pages | 34(1): 124-139 |
ISSN | 1051-8215 |
Keywords | Task analysis; Correlation; Knowledge transfer; Training; Memory management; Data models; Mutual information; Continual learning; catastrophic forgetting; class-incremental learning; experience replay |
DOI | 10.1109/TCSVT.2023.3285221 |
Corresponding Authors | Zha, Fusheng (zhafusheng@hit.edu.cn); Zhang, Junge (jgzhang@nlpr.ia.ac.cn) |
Abstract | Online Continual Learning (OCL), as a core step towards achieving human-level intelligence, aims to incrementally learn and accumulate novel concepts from streaming data that can be seen only once, while alleviating catastrophic forgetting of previously acquired knowledge. Under this mode, the model needs to learn new classes or tasks in an online manner, and the data distribution may change over time. Moreover, task boundaries and identities are not available during training and evaluation. To balance the stability and plasticity of networks, in this work, we propose a replay-based framework for OCL, named Contrastive Correlation Preserving Replay (CCPR), which focuses on not only instances but also correlations between multiple instances. Specifically, besides the previous raw samples, the corresponding representations are stored in the memory and used to construct correlations for the past and the current model. To better capture correlation and higher-order dependencies, we maximize the lower bound of mutual information between the past correlation and the current correlation by leveraging contrastive objectives. Furthermore, to improve the performance, we propose a new memory update strategy, which simultaneously encourages the balance and diversity of samples within the memory. With limited memory slots, it allows less redundant and more representative samples for later replay. We conduct extensive evaluations on several popular CL datasets, and experiments show that our method consistently outperforms the state-of-the-art methods and can effectively consolidate knowledge to alleviate forgetting. |
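The abstract's central mechanism — maximizing a contrastive (InfoNCE-style) lower bound on the mutual information between correlations under the past model and under the current model — can be sketched as follows. This is an illustrative NumPy sketch under assumed details, not the paper's exact loss: cosine similarity stands in for the correlation measure, matching rows of the two correlation matrices are treated as positive pairs, and the temperature value is arbitrary.

```python
import numpy as np

def pairwise_similarity(z):
    # Cosine similarity between every pair of representations:
    # row-normalize, then take inner products.
    z = z / np.linalg.norm(z, axis=1, keepdims=True)
    return z @ z.T

def contrastive_correlation_loss(z_past, z_curr, temperature=0.1):
    """InfoNCE-style bound on the MI between past and current pairwise
    correlations (illustrative assumption, not the published CCPR loss).

    z_past: representations of memory samples under the stored past model.
    z_curr: representations of the same samples under the current model.
    """
    s_past = pairwise_similarity(z_past)   # correlations under the past model
    s_curr = pairwise_similarity(z_curr)   # correlations under the current model
    # Normalize each correlation row; row i of s_past and row i of s_curr
    # form the positive pair, all other rows act as negatives.
    p = s_past / np.linalg.norm(s_past, axis=1, keepdims=True)
    c = s_curr / np.linalg.norm(s_curr, axis=1, keepdims=True)
    logits = (p @ c.T) / temperature       # n x n similarities of correlation rows
    # Cross-entropy with the diagonal as targets: -mean_i log softmax(logits)[i, i]
    logits = logits - logits.max(axis=1, keepdims=True)   # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))
```

When the current model still reproduces the past correlation structure, each row of `s_curr` matches its counterpart in `s_past`, the diagonal logits dominate, and the loss is small; as the structure drifts, the loss grows, which is the sense in which minimizing it preserves correlations during replay.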
WOS Keywords | KNOWLEDGE |
Funding Project | National Natural Science Foundation of China |
WOS Research Area | Engineering |
Language | English |
Publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC |
WOS Record | WOS:001138814400038 |
Funding Agency | National Natural Science Foundation of China |
Source URL | [http://ir.ia.ac.cn/handle/173211/55545]
Collection | Laboratory of Cognition and Decision Intelligence for Complex Systems
Author Affiliations | 1. CAS Ctr Excellence Brain Sci & Intelligence Techno, Shanghai 200031, Peoples R China; 2. Univ Chinese Acad Sci, Sch Artificial Intelligence, Beijing 100049, Peoples R China; 3. Chinese Acad Sci CASIA, Inst Automat, Ctr Res Intelligent Syst & Engn, Beijing 100190, Peoples R China; 4. Harbin Inst Technol HIT, State Key Lab Robot & Syst, Harbin 150080, Peoples R China
Recommended Citation (GB/T 7714) | Yu, Da, Zhang, Mingyi, Li, Mantian, et al. Contrastive Correlation Preserving Replay for Online Continual Learning[J]. IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2024, 34(1): 124-139.
APA | Yu, Da, Zhang, Mingyi, Li, Mantian, Zha, Fusheng, Zhang, Junge, ... & Huang, Kaiqi. (2024). Contrastive Correlation Preserving Replay for Online Continual Learning. IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 34(1), 124-139.
MLA | Yu, Da, et al. "Contrastive Correlation Preserving Replay for Online Continual Learning." IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY 34.1 (2024): 124-139.
Ingest Method: OAI Harvesting
Source: Institute of Automation, Chinese Academy of Sciences
Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.