Feature Selection Based on Structured Sparsity: A Comprehensive Study
Document Type | Journal Article |
Authors | Gui, Jie (1); Sun, Zhenan (2); Ji, Shuiwang; Tao, Dacheng; Tan, Tieniu |
Journal | IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS |
Publication Date | 2017-07-01 |
Volume | 28 | Issue | 7 | Pages | 1490-1507 |
Keywords | Dimensionality Reduction ; Feature Selection ; Sparse ; Structured Sparsity |
DOI | 10.1109/TNNLS.2016.2551724 |
Document Subtype | Article |
Abstract | Feature selection (FS) is an important component of many pattern recognition tasks. In these tasks, one is often confronted with very high-dimensional data. FS algorithms are designed to identify the relevant feature subset from the original features, which can facilitate subsequent analysis, such as clustering and classification. Structured sparsity-inducing feature selection (SSFS) methods have been widely studied in the last few years, and a number of algorithms have been proposed. However, there is no comprehensive study concerning the connections between different SSFS methods, and how they have evolved. In this paper, we attempt to provide a survey on various SSFS methods, including their motivations and mathematical representations. We then explore the relationship among different formulations and propose a taxonomy to elucidate their evolution. We group the existing SSFS methods into two categories, i.e., vector-based feature selection (feature selection based on lasso) and matrix-based feature selection (feature selection based on the l(r,p)-norm). Furthermore, FS has been combined with other machine learning algorithms for specific applications, such as multitask learning, multilabel learning, multiview learning, classification, and clustering. This paper not only compares the differences and commonalities of these methods based on regression and regularization strategies, but also provides useful guidelines for practitioners in related fields on how to perform feature selection. |
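To illustrate the two categories named in the abstract, the sketch below contrasts vector-based feature selection (lasso on a single response vector) with matrix-based feature selection (an l(2,1)-norm, i.e., l(r,p) with r = 2 and p = 1, over a coefficient matrix shared across several outputs). This is a minimal example using scikit-learn's Lasso and MultiTaskLasso on synthetic data, not the specific algorithms surveyed in the paper; the dataset sizes and alpha values are illustrative choices only.

```python
# Minimal sketch: vector-based FS (lasso) vs. matrix-based FS (l_{2,1}-norm),
# using scikit-learn on synthetic data. Alpha values are illustrative only.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso, MultiTaskLasso

# Vector-based: a single target y; the lasso penalty zeroes out
# coefficients of irrelevant features.
X, y = make_regression(n_samples=200, n_features=50, n_informative=5,
                       noise=0.1, random_state=0)
lasso = Lasso(alpha=0.1).fit(X, y)
selected_vec = np.flatnonzero(lasso.coef_)   # indices of retained features
print("lasso keeps features:", selected_vec)

# Matrix-based: multiple targets Y; the l_{2,1}-norm penalty (MultiTaskLasso)
# couples the rows of the coefficient matrix W, so a feature is kept or
# discarded jointly across all tasks.
X, Y = make_regression(n_samples=200, n_features=50, n_informative=5,
                       n_targets=3, noise=0.1, random_state=0)
mtl = MultiTaskLasso(alpha=0.5).fit(X, Y)
W = mtl.coef_.T                              # shape (n_features, n_targets)
row_norms = np.linalg.norm(W, axis=1)        # l2 norm of each feature's row
selected_mat = np.flatnonzero(row_norms > 1e-8)
print("l_{2,1} keeps features:", selected_mat)
```

In the l(2,1) case, selection operates on whole rows of the coefficient matrix, which is what makes the matrix-based formulations natural for the multitask and multilabel settings mentioned in the abstract.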
WOS Keywords | SUPPORT VECTOR MACHINES ; UNSUPERVISED FEATURE-SELECTION ; ROBUST FEATURE-EXTRACTION ; MAXIMUM MARGIN CRITERION ; DIMENSIONALITY REDUCTION ; VARIABLE SELECTION ; GENE SELECTION ; LOGISTIC-REGRESSION ; IMAGING GENETICS ; CLASSIFICATION |
WOS Research Areas | Computer Science ; Engineering |
Language | English |
WOS Accession Number | WOS:000404048300001 |
Funding | National Natural Science Foundation of China (61572463) ; "Thirteenth Five-Year" National Key Research and Development Program of China (2016YFD0702002) ; Open Project Program of the National Laboratory of Pattern Recognition (NLPR) (201700027) ; Open Project Program of the State Key Lab of CAD&CG, Zhejiang University (A1709) ; Shanghai Key Laboratory of Intelligent Information Processing, China (IIPL-2016-003) ; Australian Research Council (DP-140102164 ; FT-130101457 ; LE140100061) ; U.S. National Science Foundation (DBI-1147134 ; DBI-1350258) |
Source URL | http://ir.ia.ac.cn/handle/173211/15236 |
Collection | Institute of Automation_Center for Research on Intelligent Perception and Computing |
Author Affiliations | 1. Chinese Acad Sci, Inst Intelligent Machines, Hefei 230031, Peoples R China; 2. Chinese Acad Sci, Inst Automat, Ctr Res Intelligent Percept & Comp, Natl Lab Pattern Recognit, Beijing 100190, Peoples R China; 3. Washington State Univ, Sch Elect Engn & Comp Sci, Pullman, WA 99164 USA; 4. Univ Technol Sydney, Fac Engn & Informat Technol, Ctr Quantum Computat & Intelligent Syst, Ultimo, NSW 2007, Australia |
Recommended Citation (GB/T 7714) | Gui, Jie, Sun, Zhenan, Ji, Shuiwang, et al. Feature Selection Based on Structured Sparsity: A Comprehensive Study[J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2017, 28(7): 1490-1507. |
APA | Gui, Jie, Sun, Zhenan, Ji, Shuiwang, Tao, Dacheng, & Tan, Tieniu. (2017). Feature Selection Based on Structured Sparsity: A Comprehensive Study. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 28(7), 1490-1507. |
MLA | Gui, Jie, et al. "Feature Selection Based on Structured Sparsity: A Comprehensive Study". IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS 28.7 (2017): 1490-1507. |
Deposit Method: OAI harvesting
Source: Institute of Automation