A Comprehensive Framework for Long-Tailed Learning via Pretraining and Normalization
Document Type: Journal Article
Authors | Kang, Nan [2,3]; Chang, Hong [1,2,3]; Ma, Bingpeng [4]; Shan, Shiguang [1,2,3]
Journal | IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS
Publication Date | 2022-07-27
Pages | 13
Keywords | Training; Tail; Task analysis; Head; Visualization; Feature extraction; Data models; Classifier design; contrastive learning; long-tailed recognition; normalization
ISSN | 2162-237X
DOI | 10.1109/TNNLS.2022.3192475 |
Abstract | Data in the visual world often present long-tailed distributions. However, learning high-quality representations and classifiers for imbalanced data is still challenging for data-driven deep learning models. In this work, we aim to improve the feature extractor and classifier for long-tailed recognition via contrastive pretraining and feature normalization, respectively. First, we carefully study the influence of contrastive pretraining under different conditions, showing that current self-supervised pretraining for long-tailed learning is still suboptimal in both performance and speed. We thus propose a new balanced contrastive loss and a fast contrastive initialization scheme to improve previous long-tailed pretraining. Second, based on a motivating analysis of classifier normalization, we propose a novel generalized normalization classifier that consists of generalized normalization and grouped learnable scaling. It outperforms both the traditional inner-product classifier and the cosine classifier. Both proposed components improve recognition on tail classes without sacrificing accuracy on head classes. We finally build a unified framework that achieves competitive performance compared with state-of-the-art methods on several long-tailed recognition benchmarks while maintaining high efficiency.
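For illustration only (this sketch is not part of the original record and is not the authors' implementation): the abstract compares the proposed generalized normalization classifier against the standard inner-product and cosine classifiers. The snippet below is a minimal PyTorch sketch of the cosine (normalized) classifier baseline with a learnable scale; the class name, feature dimension, and initial scale value are assumptions, and the paper's generalized normalization and grouped learnable scaling are not reproduced here.

```python
# Minimal sketch (hypothetical, not the authors' code) of a cosine classifier head:
# both features and class weights are L2-normalized before the inner product,
# and a learnable scale restores the logit magnitude.

import torch
import torch.nn as nn
import torch.nn.functional as F


class CosineClassifier(nn.Module):
    """Cosine (normalized) classifier head with a single learnable scale."""

    def __init__(self, feat_dim: int, num_classes: int, init_scale: float = 16.0):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(num_classes, feat_dim) * 0.01)
        self.scale = nn.Parameter(torch.tensor(init_scale))  # learnable temperature

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        # logits_ij = scale * < f_i / ||f_i||, w_j / ||w_j|| >
        f = F.normalize(features, dim=1)
        w = F.normalize(self.weight, dim=1)
        return self.scale * (f @ w.t())


if __name__ == "__main__":
    head = CosineClassifier(feat_dim=128, num_classes=10)
    x = torch.randn(4, 128)   # a batch of backbone features (dimensions assumed)
    logits = head(x)          # shape: (4, 10)
    print(logits.shape)
```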
Funding | Natural Science Foundation of China (NSFC) [U19B2036]; Natural Science Foundation of China (NSFC) [61976203]; Natural Science Foundation of China (NSFC) [61876171]; Fundamental Research Funds for the Central Universities
WOS Research Areas | Computer Science; Engineering
Language | English
WOS Record Number | WOS:000833050600001
Publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
Source URL | [http://119.78.100.204/handle/2XEOYT63/19486]
Collection | Institute of Computing Technology, Chinese Academy of Sciences: Journal Papers (English)
Corresponding Author | Chang, Hong
Affiliations | 1. Peng Cheng Lab, Shenzhen 518055, Peoples R China; 2. Univ Chinese Acad Sci, Sch Comp Sci & Technol, Beijing 100049, Peoples R China; 3. Chinese Acad Sci, Inst Comp Technol, Key Lab Intelligent Informat Proc, Beijing 100190, Peoples R China; 4. Univ Chinese Acad Sci, Sch Comp Sci & Technol, Beijing 100049, Peoples R China
Recommended Citation (GB/T 7714) | Kang, Nan, Chang, Hong, Ma, Bingpeng, et al. A Comprehensive Framework for Long-Tailed Learning via Pretraining and Normalization[J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022: 13.
APA | Kang, Nan, Chang, Hong, Ma, Bingpeng, & Shan, Shiguang. (2022). A Comprehensive Framework for Long-Tailed Learning via Pretraining and Normalization. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 13.
MLA | Kang, Nan, et al. "A Comprehensive Framework for Long-Tailed Learning via Pretraining and Normalization". IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS (2022): 13.
Ingestion Method: OAI harvesting
Source: Institute of Computing Technology