Chinese Academy of Sciences Institutional Repositories Grid
Multiple similarities based kernel subspace learning for image classification

Document Type: Journal Article

Authors: Yan, W; Liu, QS; Lu, HQ; Ma, SD; Narayanan, PJ; Nayar, SK; Shum, HY
Source Publication: COMPUTER VISION - ACCV 2006, PT II
Publication Date: 2006
Volume: 3852
Pages: 244-253
Abstract: In this paper, we propose a new method for image classification in which matrix-based kernel features are designed to capture the multiple similarities between images across different low-level visual cues. Based on the property that a dot-product kernel can be regarded as a similarity measure, we apply kernel functions to the different low-level visual features separately to measure the similarity between two images, obtaining a kernel feature matrix for each image. To address overfitting and the cost of numerical computation, a revised version of the Two-Dimensional PCA (2DPCA) algorithm is developed to learn the intrinsic subspace of the matrix features for classification. Extensive experiments on the Corel database show the advantage of the proposed method.
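The pipeline the abstract describes can be sketched as follows: each image yields one kernel feature matrix (rows indexed by visual cue, columns by reference image), and 2DPCA then learns a low-dimensional subspace over these matrices. The sketch below is a minimal illustration under assumptions not taken from the paper: an RBF kernel stands in for the similarity measure, random vectors stand in for per-cue low-level features, and plain 2DPCA is used rather than the authors' revised variant.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(x, y, gamma=0.5):
    # Kernel value used as a similarity score between two feature vectors.
    return np.exp(-gamma * np.sum((x - y) ** 2))

def kernel_feature_matrix(cues, ref_images, gamma=0.5):
    # Entry (c, r): similarity of this image to reference image r under cue c.
    K = np.empty((len(cues), len(ref_images)))
    for c, cue_vec in enumerate(cues):
        for r, ref in enumerate(ref_images):
            K[c, r] = rbf_kernel(cue_vec, ref[c], gamma)
    return K

# Toy data: 3 cues (e.g. color/texture/shape), 16-dim features,
# 20 reference images, 50 training images. All features are random
# placeholders for real low-level descriptors.
n_cues, dim, n_refs, n_train = 3, 16, 20, 50
refs = [[rng.normal(size=dim) for _ in range(n_cues)] for _ in range(n_refs)]
train = [[rng.normal(size=dim) for _ in range(n_cues)] for _ in range(n_train)]

# One (n_cues x n_refs) kernel feature matrix per training image.
mats = np.stack([kernel_feature_matrix(img, refs) for img in train])

# Plain 2DPCA: eigen-decompose the column covariance of the centered
# matrices, G = (1/M) * sum_i (A_i - mean)^T (A_i - mean), and project
# every matrix onto the top-d eigenvectors.
centered = mats - mats.mean(axis=0)
G = np.einsum('icr,ick->rk', centered, centered) / n_train  # (n_refs, n_refs)
eigvals, eigvecs = np.linalg.eigh(G)   # eigenvalues in ascending order
d = 5
X = eigvecs[:, -d:]                    # top-d principal directions
projected = centered @ X               # (n_train, n_cues, d) reduced features
print(projected.shape)                 # -> (50, 3, 5)
```

The projected matrices could then be fed to any standard classifier (e.g. nearest neighbor); the paper's revised 2DPCA and its actual kernel and cue choices would replace the plain versions used here.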
WOS Subject Headings: Science & Technology; Technology
WOS Categories: Computer Science, Artificial Intelligence; Computer Science, Theory & Methods
WOS Research Area: Computer Science
WOS Keywords: HUMAN FACES
Indexed By: ISTP; SCI
Language: English
WOS Record ID: WOS:000235773200025
Date Available: 2015-12-24
Source URL: [http://ir.ia.ac.cn/handle/173211/9229]
Collection: Institute of Automation, Publications before 2009
Affiliation: Chinese Acad Sci, Inst Automat, Natl Lab Pattern Recognit, Beijing 100080, Peoples R China
Recommended Citation:
GB/T 7714
Yan, W, Liu, QS, Lu, HQ, et al. Multiple similarities based kernel subspace learning for image classification[J]. COMPUTER VISION - ACCV 2006, PT II, 2006, 3852: 244-253.
APA Yan, W., Liu, QS., Lu, HQ., Ma, SD., Narayanan, PJ., ... & Shum, HY. (2006). Multiple similarities based kernel subspace learning for image classification. COMPUTER VISION - ACCV 2006, PT II, 3852, 244-253.
MLA Yan, W, et al. "Multiple similarities based kernel subspace learning for image classification". COMPUTER VISION - ACCV 2006, PT II 3852 (2006): 244-253.

Ingest Method: OAI harvesting

Source: Institute of Automation

