Chinese Academy of Sciences Institutional Repositories Grid
ProxyMix: Proxy-based Mixup training with label refinery for source-free domain adaptation

Document Type: Journal Article

Authors: Ding, Yuhe [5]; Sheng, Lijun [1,3,4]; Liang, Jian [2,3,4]; Zheng, Aihua [6]; He, Ran [2,3,4]
Journal: NEURAL NETWORKS
Publication Date: 2023-10-01
Volume: 167; Pages: 92-103
ISSN: 0893-6080
Keywords: Source-free unsupervised domain adaptation; Pseudo labeling
DOI: 10.1016/j.neunet.2023.08.005
Corresponding Author: Liang, Jian (liangjian92@gmail.com)
Abstract: Due to privacy concerns and data transmission issues, Source-free Unsupervised Domain Adaptation (SFDA) has gained popularity. It exploits pre-trained source models, rather than raw source data, for target learning, transferring knowledge from a labeled source domain to an unlabeled target domain. Existing methods typically solve this problem with additional parameters or noisy pseudo labels; we propose an effective method named Proxy-based Mixup training with label refinery (ProxyMix) to avoid these drawbacks. To avoid additional parameters and to leverage the information in the source model, ProxyMix defines the classifier weights as class prototypes and constructs a class-balanced proxy source domain from the nearest neighbors of the prototypes. To improve the reliability of pseudo labels, we further propose a frequency-weighted aggregation strategy that generates soft pseudo labels for the unlabeled target data. This strategy exploits the internal structure of the target features, increases the weights of low-frequency class samples, and aligns the proxy and target domains with inter- and intra-domain mixup regularization, which mitigates the negative impact of noisy labels. Experiments on three 2D image and 3D point cloud object recognition benchmarks demonstrate that ProxyMix yields state-of-the-art performance for source-free UDA tasks. (c) 2023 Elsevier Ltd. All rights reserved.
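The proxy-source construction described in the abstract (classifier weights as class prototypes, nearest-neighbor selection of target samples) and the inter-domain mixup step can be sketched in a few lines of PyTorch. The sketch below is a minimal illustration under assumptions: the function names build_proxy_source and interdomain_mixup, the per-class top-k size k, and the Beta(alpha, alpha) mixing coefficient are placeholders rather than the authors' implementation or settings; the frequency-weighted aggregation of soft pseudo labels is omitted, and mixing is applied to feature vectors only to keep the example self-contained.

import torch
import torch.nn.functional as F


def build_proxy_source(features, classifier_weight, k=5):
    """For each class, pick the k target samples whose features are closest
    (cosine similarity) to that class's prototype, i.e. the corresponding
    classifier weight vector, yielding a class-balanced proxy source set."""
    feats = F.normalize(features, dim=1)              # (N, D) L2-normalized target features
    protos = F.normalize(classifier_weight, dim=1)    # (C, D) class prototypes
    sim = feats @ protos.t()                          # (N, C) cosine similarities
    idx = sim.topk(k, dim=0).indices                  # (k, C) nearest samples per class
    proxy_idx = idx.t().reshape(-1)                   # (C*k,) selected sample indices
    proxy_labels = torch.arange(protos.size(0)).repeat_interleave(k)
    return proxy_idx, proxy_labels


def interdomain_mixup(x_proxy, y_proxy, x_target, y_target_soft, alpha=0.3):
    """Mix proxy-source samples (one-hot labels) with target samples
    (soft pseudo labels); the mixed pairs are what the model trains on."""
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    n = min(x_proxy.size(0), x_target.size(0))
    x_mix = lam * x_proxy[:n] + (1.0 - lam) * x_target[:n]
    y_mix = lam * y_proxy[:n] + (1.0 - lam) * y_target_soft[:n]
    return x_mix, y_mix


if __name__ == "__main__":
    # Toy run: random tensors stand in for extracted target features and the
    # source classifier's weight matrix.
    N, D, C = 200, 64, 10
    target_feats = torch.randn(N, D)
    clf_weight = torch.randn(C, D)
    proxy_idx, proxy_y = build_proxy_source(target_feats, clf_weight, k=5)
    x_proxy = target_feats[proxy_idx]
    y_proxy = F.one_hot(proxy_y, num_classes=C).float()
    y_soft = torch.softmax(target_feats @ clf_weight.t(), dim=1)  # placeholder soft pseudo labels
    x_mix, y_mix = interdomain_mixup(x_proxy, y_proxy, target_feats, y_soft)
    print(x_mix.shape, y_mix.shape)                   # torch.Size([50, 64]) torch.Size([50, 10])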
Funding Projects: National Natural Science Foundation of China [62276256]; Beijing Nova Program, China [Z211100002121108]; University Synergy Innovation Program of Anhui Province [GXXT-2022-036]
WOS Research Areas: Computer Science; Neurosciences & Neurology
Language: English
Publisher: PERGAMON-ELSEVIER SCIENCE LTD
WOS Accession Number: WOS:001068310300001
Funding Organizations: National Natural Science Foundation of China; Beijing Nova Program, China; University Synergy Innovation Program of Anhui Province
Source URL: [http://ir.ia.ac.cn/handle/173211/53089]
Collection: State Key Laboratory of Multimodal Artificial Intelligence Systems
Corresponding Author: Liang, Jian
Author Affiliations:
1. Univ Sci & Technol China, Hefei, Peoples R China
2.Univ Chinese Acad Sci, Sch Artificial Intelligence, Beijing, Peoples R China
3.Chinese Acad Sci CASIA, Inst Automat, Ctr Res Intelligent Percept & Comp, Beijing, Peoples R China
4.Chinese Acad Sci CASIA, State Key Lab Multimodal Artificial Intelligence S, Beijing, Peoples R China
5.Anhui Univ, Sch Comp Sci & Technol, Hefei, Peoples R China
6.Anhui Univ, Sch Artificial Intelligence, Hefei, Peoples R China
Recommended Citation
GB/T 7714
Ding, Yuhe, Sheng, Lijun, Liang, Jian, et al. ProxyMix: Proxy-based Mixup training with label refinery for source-free domain adaptation[J]. NEURAL NETWORKS, 2023, 167: 92-103.
APA: Ding, Yuhe, Sheng, Lijun, Liang, Jian, Zheng, Aihua, & He, Ran. (2023). ProxyMix: Proxy-based Mixup training with label refinery for source-free domain adaptation. NEURAL NETWORKS, 167, 92-103.
MLA: Ding, Yuhe, et al. "ProxyMix: Proxy-based Mixup training with label refinery for source-free domain adaptation". NEURAL NETWORKS 167 (2023): 92-103.

Ingestion Method: OAI Harvesting

Source: Institute of Automation, Chinese Academy of Sciences

