MaxMatch: Semi-Supervised Learning With Worst-Case Consistency
Document Type: Journal Article
Authors | Jiang, Yangbangyan3,5; Li, Xiaodan4; Chen, Yuefeng4; He, Yuan4; Xu, Qianqian6; Yang, Zhiyong7; Cao, Xiaochun8; Huang, Qingming1,2,6,7 |
Journal | IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE |
Publication Date | 2023-05-01 |
Volume | 45 |
Issue | 5 |
Pages | 5970-5987 |
Keywords | Predictive models; Training; Data models; Semantics; Perturbation methods; Computational modeling; Benchmark testing; Semi-supervised learning; consistency regularization; worst-case consistency; image classification |
ISSN | 0162-8828 |
DOI | 10.1109/TPAMI.2022.3208419 |
Abstract | In recent years, great progress has been made in incorporating unlabeled data to mitigate insufficient supervision via semi-supervised learning (SSL). Most state-of-the-art models are based on the idea of pursuing consistent model predictions on unlabeled data under input noise, which is called consistency regularization. Nonetheless, there is a lack of theoretical insight into the reason behind its success. To bridge the gap between theoretical and practical results, we propose a worst-case consistency regularization technique for SSL in this article. Specifically, we first present a generalization bound for SSL consisting of the empirical loss terms observed on labeled and unlabeled training data separately. Motivated by this bound, we derive an SSL objective that minimizes the largest inconsistency between an original unlabeled sample and its multiple augmented variants. We then provide a simple but effective algorithm to solve the proposed minimax problem, and theoretically prove that it converges to a stationary point. Experiments on five popular benchmark datasets validate the effectiveness of our proposed method. |
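The worst-case consistency objective described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual implementation: the function name `worst_case_consistency_loss`, the use of a hard pseudo-label, and the cross-entropy inconsistency measure are assumptions; the real MaxMatch algorithm may differ in details such as confidence thresholding and how the minimax problem is optimized during training.

```python
import math

def worst_case_consistency_loss(p_orig, p_augs, eps=1e-12):
    """Sketch of the worst-case consistency idea: measure the
    inconsistency (cross-entropy) between the pseudo-label of an
    original unlabeled sample and each of its K augmented variants,
    and keep only the largest (worst-case) one, which the SSL
    objective then minimizes.

    p_orig: list of C predicted class probabilities for the original sample
    p_augs: K lists of C predicted class probabilities, one per augmentation
    """
    # Hard pseudo-label: the most confident class on the original view
    y_hat = max(range(len(p_orig)), key=lambda c: p_orig[c])
    # Cross-entropy of each augmented prediction against that pseudo-label
    ce = [-math.log(p[y_hat] + eps) for p in p_augs]
    # Worst case: the largest inconsistency over the K augmented variants
    return max(ce)
```

For example, if the original view predicts class 0 confidently and one of two augmented views is maximally uncertain, the loss is driven by that worst view rather than the average over views; minimizing this max is the inner part of the minimax objective.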
Funding | National Key R&D Program of China [2018AAA0102000]; National Natural Science Foundation of China [U21B2038]; National Natural Science Foundation of China [61931008]; National Natural Science Foundation of China [62025604]; National Natural Science Foundation of China [U1936208]; National Natural Science Foundation of China [6212200758]; National Natural Science Foundation of China [61976202]; Fundamental Research Funds for the Central Universities; Youth Innovation Promotion Association CAS; Strategic Priority Research Program of Chinese Academy of Sciences [XDB28000000]; China National Postdoctoral Program for Innovative Talents [BX2021298]; China Postdoctoral Science Foundation [2022M713101] |
WOS Research Areas | Computer Science; Engineering |
Language | English |
WOS Accession Number | WOS:000964792800040 |
Publisher | IEEE COMPUTER SOC |
Source URL | http://119.78.100.204/handle/2XEOYT63/21397 |
Collection | Journal Articles (English), Institute of Computing Technology, Chinese Academy of Sciences |
Corresponding Authors | Xu, Qianqian; Huang, Qingming |
Author Affiliations | 1. Univ Chinese Acad Sci, Key Lab Big Data Min & Knowledge Management BDKM, Beijing 101408, Peoples R China
2. Peng Cheng Lab, Shenzhen 518055, Peoples R China
3. Chinese Acad Sci, Inst Informat Engn, State Key Lab Informat Secur, Beijing 100093, Peoples R China
4. Alibaba Grp, Secur Dept, Hangzhou 311121, Peoples R China
5. Univ Chinese Acad Sci, Sch Cyber Secur, Beijing 100049, Peoples R China
6. Chinese Acad Sci, Inst Comp Technol, Key Lab Intelligent Informat Proc, Beijing 100190, Peoples R China
7. Univ Chinese Acad Sci, Sch Comp Sci & Technol, Beijing 101408, Peoples R China
8. Sun Yat Sen Univ, Sch Cyber Sci & Technol, Shenzhen Campus, Shenzhen 518107, Peoples R China |
Recommended Citation (GB/T 7714) | Jiang, Yangbangyan, Li, Xiaodan, Chen, Yuefeng, et al. MaxMatch: Semi-Supervised Learning With Worst-Case Consistency[J]. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, 45(5): 5970-5987. |
APA | Jiang, Y., Li, X., Chen, Y., He, Y., Xu, Q., ... & Huang, Q. (2023). MaxMatch: Semi-Supervised Learning With Worst-Case Consistency. IEEE Transactions on Pattern Analysis and Machine Intelligence, 45(5), 5970-5987. |
MLA | Jiang, Yangbangyan, et al. "MaxMatch: Semi-Supervised Learning With Worst-Case Consistency." IEEE Transactions on Pattern Analysis and Machine Intelligence 45.5 (2023): 5970-5987. |
Ingestion Method: OAI harvesting
Source: Institute of Computing Technology
Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.