Self-centralized jointly sparse maximum margin criterion for robust dimensionality reduction
Document Type: Journal Article
Authors | Hu, Liangchen (1); Xu, Jingke (3); Tian, Lei (2,4); Zhang, Wensheng |
Journal | KNOWLEDGE-BASED SYSTEMS |
Publication Date | 2020-10-28 |
Volume | 206 |
Pages | 15 |
Keywords | Maximum margin criterion; Robustness; Adaptive centroid; L1,2-norm sparsity; Dimensionality reduction |
ISSN | 0950-7051 |
DOI | 10.1016/j.knosys.2020.106343 |
Corresponding Author | Zhang, Wensheng (zhangwenshengia@hotmail.com) |
Abstract (English) | Linear discriminant analysis (LDA) is among the most popular supervised dimensionality reduction algorithms and has been widely adopted in pattern recognition and data mining. However, LDA has three major drawbacks: first, the small-sample-size (SSS) problem; second, sensitivity to outliers caused by the squared L2-norm used to evaluate the scatter loss; third, the feature loadings in the projection matrix tend to be redundant, which brings a risk of overfitting. In this paper, we put forward a novel functional expression for LDA that combines the maximum margin criterion (MMC) with a weighting strategy formulated by L1,2-norms to guard against outliers. Meanwhile, we adaptively compute the weighted intra-class and global centroids to further reduce the influence of outliers, and employ the L2,1-norm to enforce row sparsity so that subspace learning and feature selection are performed cooperatively. In addition, an effective alternating iterative algorithm is derived and its convergence is verified. The complexity analysis shows that the proposed algorithm can handle large-scale data. The proposed model addresses the sensitivity to outliers and extracts the most representative features while effectively preventing overfitting. Experiments on several benchmark databases demonstrate that the proposed algorithm is more effective than several state-of-the-art methods and has better generalization performance. (C) 2020 Elsevier B.V. All rights reserved. |
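The abstract states that an L2,1-norm constraint on the projection matrix enforces row sparsity so that subspace learning and feature selection are carried out jointly. As a minimal sketch (not the authors' code; the matrix `W` and its dimensions are illustrative assumptions), the L2,1-norm is simply the sum of the Euclidean norms of the rows of the projection matrix; penalizing it pushes whole rows toward zero, which corresponds to discarding the associated input features:

```python
import numpy as np

def l21_norm(W):
    """L2,1-norm: sum of the Euclidean (L2) norms of the rows of W.
    Penalizing this quantity drives entire rows of W toward zero,
    which deselects the corresponding input features."""
    return np.sum(np.linalg.norm(W, axis=1))

# Illustrative (hypothetical) projection matrix: 100 input features -> 10 dims.
W = np.random.randn(100, 10)
print(l21_norm(W))
```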
WOS Keywords | LINEAR DISCRIMINANT-ANALYSIS; FACE RECOGNITION |
Funding Projects | National Key R&D Program of China [2017YFC0806500]; National Natural Science Foundation of China [U1636220]; National Natural Science Foundation of China [61876183]; National Natural Science Foundation of China [61772525] |
WOS Research Area | Computer Science |
Language | English |
WOS Record Number | WOS:000572851100006 |
Publisher | ELSEVIER |
Funding Agencies | National Key R&D Program of China; National Natural Science Foundation of China |
Source URL | [http://ir.ia.ac.cn/handle/173211/42022] |
Collection | Research Center for Precision Sensing and Control - Artificial Intelligence and Machine Learning |
Affiliations | 1. Nanjing Univ Sci & Technol, Sch Comp Sci & Engn, Nanjing 210094, Peoples R China
2. Chinese Acad Sci, Res Ctr Precis Sensing & Control, Inst Automat, Beijing 100190, Peoples R China
3. Shandong Agr Univ, Coll Informat Sci & Engn, Tai An 271018, Shandong, Peoples R China
4. Univ Chinese Acad Sci, Beijing 101408, Peoples R China |
Recommended Citation (GB/T 7714) | Hu, Liangchen, Xu, Jingke, Tian, Lei, et al. Self-centralized jointly sparse maximum margin criterion for robust dimensionality reduction[J]. KNOWLEDGE-BASED SYSTEMS, 2020, 206: 15. |
APA | Hu, Liangchen, Xu, Jingke, Tian, Lei, & Zhang, Wensheng. (2020). Self-centralized jointly sparse maximum margin criterion for robust dimensionality reduction. KNOWLEDGE-BASED SYSTEMS, 206, 15. |
MLA | Hu, Liangchen, et al. "Self-centralized jointly sparse maximum margin criterion for robust dimensionality reduction". KNOWLEDGE-BASED SYSTEMS 206 (2020): 15. |
Ingestion Method: OAI harvesting
Source: Institute of Automation