Chinese Academy of Sciences Institutional Repositories Grid
ESDB: Expand The Shrinking Decision Boundary via One-to-Many Information Matching for Continual Learning with Small Memory

Document type: Journal article

Authors: Kunchi Li 2,3; Hongyang Chen 1; Jun Wan 2,3; Shan YU 2,3
Journal: IEEE Transactions on Circuits and Systems for Video Technology
Publication date: 2024
Pages: /
English abstract
Abstract: Rehearsal methods based on knowledge distillation (KD) have been widely used in continual learning (CL). However, under memory constraints, the few stored exemplars capture only limited variations of previously learned tasks, impeding the effectiveness of KD in retaining long-term knowledge. The decision boundaries learned by the typical KD strategy overfit the limited exemplars, leading to "shrunk boundaries" for the old classes. To tackle this problem, we propose a novel KD strategy, called the One-to-Many Information Matching method (O2MIM), which generates interpolated data by mixing samples between old and new classes, disentangles the supervision information from them, and assigns supervision information to them in favor of the old classes. By doing so, the supervision information from a single exemplar can be matched with information from multiple different interpolated images. Moreover, O2MIM utilizes one trainable parameter to create an adaptive KD loss, thereby facilitating a flexible matching process with the designated supervision information. Consequently, O2MIM exploits the exemplar coreset more effectively, expanding the shrunk decision boundaries towards the new classes. Next, to incorporate new classes into our classification model, we apply an effective classification training strategy to train a debiased classifier. Combining it with O2MIM, we propose the method of Expanding the Shrinking Decision Boundaries (ESDB), which simultaneously transfers knowledge from the old model via O2MIM and learns new classes by the classification training strategy. Extensive experiments demonstrate that ESDB achieves state-of-the-art performance on diverse CL benchmarks. We also confirm that O2MIM can be used with various label-mixing methods to improve overall performance in CL. The code is available at: https://github.com/CSTiger77/ESDB.
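As a rough illustration of the mixup-style interpolation and old-class-biased label assignment the abstract describes, here is a minimal sketch. The function name, the square-root re-weighting, and all parameters are assumptions for illustration only, not the authors' implementation (which is available at the GitHub link above):

```python
import numpy as np

def o2m_interpolate(old_x, old_y, new_x, new_y, num_classes, lams):
    """Hypothetical one-to-many sketch: mix one old-class exemplar with a
    new-class sample at several mixing ratios, so a single exemplar yields
    multiple interpolated images, each with its own soft target biased
    toward the old class."""
    mixed_images, mixed_labels = [], []
    for lam in lams:
        x = lam * old_x + (1.0 - lam) * new_x   # pixel-space mixup
        y = np.zeros(num_classes)
        # Bias the supervision toward the old class: sqrt(lam) > lam
        # for 0 < lam < 1 (assumed re-weighting, not the paper's formula).
        w_old = np.sqrt(lam)
        y[old_y] = w_old
        y[new_y] = 1.0 - w_old
        mixed_images.append(x)
        mixed_labels.append(y)
    return np.stack(mixed_images), np.stack(mixed_labels)
```

With several mixing ratios per exemplar, the supervision from one stored sample is matched against many interpolated inputs, which is one plausible reading of the "one-to-many" matching in the abstract.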
Language: English
Source URL: http://ir.ia.ac.cn/handle/173211/57192
Collection: Institute of Automation, Brainnetome Center
Corresponding authors: Jun Wan; Shan YU
Affiliations:
1. Zhejiang Lab
2. Institute of Automation, Chinese Academy of Sciences
3. School of Artificial Intelligence, University of Chinese Academy of Sciences
Recommended citation formats:
GB/T 7714: Kunchi Li, Hongyang Chen, Jun Wan, et al. ESDB: Expand The Shrinking Decision Boundary via One-to-Many Information Matching for Continual Learning with Small Memory[J]. IEEE Transactions on Circuits and Systems for Video Technology, 2024: /.
APA: Kunchi Li, Hongyang Chen, Jun Wan, & Shan YU. (2024). ESDB: Expand The Shrinking Decision Boundary via One-to-Many Information Matching for Continual Learning with Small Memory. IEEE Transactions on Circuits and Systems for Video Technology, /.
MLA: Kunchi Li, et al. "ESDB: Expand The Shrinking Decision Boundary via One-to-Many Information Matching for Continual Learning with Small Memory". IEEE Transactions on Circuits and Systems for Video Technology (2024): /.

Deposit method: OAI harvesting

Source: Institute of Automation


Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.