Chinese Academy of Sciences Institutional Repositories Grid
CKDF: Cascaded Knowledge Distillation Framework for Robust Incremental Learning

Document Type: Journal Article

Authors: Li KC (李焜炽); Wan J (万军); Yu S (余山)
Journal: IEEE TRANSACTIONS ON IMAGE PROCESSING
Publication Date: 2022
Volume: 31, Pages: 3825–3837
Abstract

Recently, owing to their superior performance, knowledge distillation-based (KD-based) methods with exemplar rehearsal have been widely applied in class incremental learning (CIL). However, we discover that they suffer from a feature uncalibration problem, which is caused by directly transferring knowledge from the old model to the new model when learning a new task. Because the old model confuses the feature representations of the learned and new classes, the KD loss and the classification loss used in KD-based methods are heterogeneous, so learning the existing knowledge directly from the old model, as typical KD-based methods do, is detrimental. To tackle this problem, we propose a feature calibration network (FCN), which calibrates the existing knowledge to alleviate the feature representation confusion of the old model. In addition, to relieve the task-recency bias of FCN caused by the limited storage memory in CIL, we propose a novel image-feature hybrid sample rehearsal strategy that trains FCN by splitting the memory budget to store both image and feature exemplars of the previous tasks. Since feature embeddings of images have much lower dimensions, this allows us to store more samples to train FCN. Based on these two improvements, we propose the Cascaded Knowledge Distillation Framework (CKDF), which consists of three main stages. The first stage trains FCN to calibrate the existing knowledge of the old model. Then, the new model is trained by simultaneously transferring knowledge from the calibrated teacher model through the knowledge distillation strategy and learning the new classes. Finally, after the new task is learned, the feature exemplars of previous tasks are updated. Importantly, we demonstrate that the proposed CKDF is a general framework that can be applied to various KD-based methods. Experimental results show that our method achieves state-of-the-art performance on several CIL benchmarks.
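As a rough illustration of the three-stage loop described in the abstract, the following PyTorch-style sketch shows one incremental-task step. The loader keys, the (logits, features) model interface, the old_model.features() extractor, the MSE-based calibration and distillation losses, and the lambda_kd weight are all assumptions made for illustration, not the paper's exact formulation.

    import torch
    import torch.nn.functional as F

    def ckdf_task_step(old_model, new_model, fcn, loaders, lambda_kd=1.0):
        # Stage 1: train the feature calibration network (FCN) on the stored
        # image-and-feature exemplars so it maps the old model's confused
        # feature space toward a calibrated one (MSE target is an assumption).
        opt_fcn = torch.optim.SGD(fcn.parameters(), lr=0.01)
        for feats_old, feats_target in loaders["exemplars"]:
            loss = F.mse_loss(fcn(feats_old), feats_target)
            opt_fcn.zero_grad(); loss.backward(); opt_fcn.step()

        # Stage 2: train the new model on the new task while distilling from
        # the calibrated teacher (old_model features passed through the FCN).
        opt = torch.optim.SGD(new_model.parameters(), lr=0.1)
        for x, y in loaders["new_task"]:
            logits, feats = new_model(x)                # assumed (logits, features) API
            with torch.no_grad():
                teacher_feats = fcn(old_model.features(x))  # assumed feature extractor
            loss_cls = F.cross_entropy(logits, y)       # learn the new classes
            loss_kd = F.mse_loss(feats, teacher_feats)  # transfer calibrated knowledge
            loss = loss_cls + lambda_kd * loss_kd
            opt.zero_grad(); loss.backward(); opt.step()

        # Stage 3: after the task is learned, refresh the stored feature
        # exemplars of previous tasks with the updated model (details omitted).

Storing low-dimensional feature exemplars alongside a smaller set of raw images is what lets the fixed memory budget cover more rehearsal samples for training FCN.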

Language: English
Source URL: [http://ir.ia.ac.cn/handle/173211/56635]
Collection: Institute of Automation, Brainnetome Center
Corresponding Author: Yu S (余山)
Affiliations: 1. Institute of Automation, Chinese Academy of Sciences
2. School of Artificial Intelligence, University of Chinese Academy of Sciences
Recommended Citation
GB/T 7714
Li KC, Wan J, Yu S. CKDF: Cascaded Knowledge Distillation Framework for Robust Incremental Learning[J]. IEEE TRANSACTIONS ON IMAGE PROCESSING, 2022, 31: 3825–3837.
APA Li KC, Wan J, & Yu S. (2022). CKDF: Cascaded Knowledge Distillation Framework for Robust Incremental Learning. IEEE TRANSACTIONS ON IMAGE PROCESSING, 31, 3825–3837.
MLA Li KC, et al. "CKDF: Cascaded Knowledge Distillation Framework for Robust Incremental Learning". IEEE TRANSACTIONS ON IMAGE PROCESSING 31 (2022): 3825–3837.

Ingestion Method: OAI Harvesting

Source: Institute of Automation

