Chinese Academy of Sciences Institutional Repositories Grid
一种基于加权颜色聚合向量的图像检索方法 (A Method of Image Retrieval Based on Weighted Color Coherence Vector)

Document Type: Journal Article

Authors: 徐朝辉; 樊银亭; 闫奎名; 滕东兴; 杨海燕
Journal: 微电子学与计算机
Publication Date: 2014
Volume: 31, Issue: 2, Pages: 31-34, 38
Keywords: image retrieval; color histogram; color coherence vector; saliency map
ISSN: 1000-7180
Alternative Title: A Method of Image Retrieval Based on Weighted Color Coherence Vector
Chinese Abstract (translated): Building on the image color coherence vector and incorporating salient image features, an image retrieval method based on a weighted color coherence vector is proposed. First, the saliency map of the image is extracted and normalized to obtain a weighting matrix. Then the color coherence vector of the image is extracted and weighted according to this matrix. Finally, retrieval is performed by computing the similarity between the weighted color coherence vectors of two images. The method takes both the color-distribution features and the high-level visual features of the image into account while remaining fast to compute. Experimental results show that its retrieval precision is clearly higher than that of traditional retrieval based on color statistics.
English Abstract: By taking visual saliency into consideration, a new image retrieval method based on the color coherence vector is proposed in this paper. According to the saliency map of the image, every pixel is assigned a weighting value, and the color coherence vector is then computed over these weighted pixels. The resulting vector reflects both the low-level color-region distribution of the image and its high-level visual characteristics. Experiments verify that this method is more effective than traditional approaches based on the color histogram.
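
The two abstracts describe the same three-step pipeline: normalize the saliency map into a per-pixel weighting matrix, accumulate the color coherence vector (CCV) over those weights instead of raw pixel counts, and rank database images by the similarity of the resulting vectors. The following Python sketch illustrates that idea only in outline; the saliency map is assumed to be precomputed, and the 64-bin uniform color quantization, the coherence threshold tau, and the L1-based similarity are illustrative assumptions, not parameters reported in the paper.

import numpy as np
from scipy.ndimage import label

def weighted_ccv(image, saliency, n_bins=64, tau=300):
    """image: HxWx3 uint8 RGB array; saliency: HxW float saliency map."""
    # Normalize the saliency map into a per-pixel weighting matrix in [0, 1].
    w = saliency.astype(np.float64)
    w = (w - w.min()) / (w.max() - w.min() + 1e-12)

    # Coarse uniform color quantization (the classical CCV also blurs the
    # image slightly first; that step is omitted here for brevity).
    per_channel = int(round(n_bins ** (1.0 / 3.0)))   # e.g. 4 levels per channel
    q = (image.astype(np.int64) * per_channel) // 256
    color_idx = q[..., 0] * per_channel**2 + q[..., 1] * per_channel + q[..., 2]

    n_colors = per_channel ** 3
    coherent = np.zeros(n_colors)
    incoherent = np.zeros(n_colors)

    for c in range(n_colors):
        mask = color_idx == c
        if not mask.any():
            continue
        # Connected components of this color: components larger than tau
        # pixels count as "coherent", the rest as "incoherent". Instead of
        # counting pixels, we accumulate their saliency weights.
        comps, n_comps = label(mask)
        for k in range(1, n_comps + 1):
            comp_mask = comps == k
            weight_sum = w[comp_mask].sum()
            if comp_mask.sum() > tau:
                coherent[c] += weight_sum
            else:
                incoherent[c] += weight_sum

    return np.concatenate([coherent, incoherent])

def ccv_similarity(v1, v2):
    # One plausible similarity: inverse of the L1 distance between vectors.
    return 1.0 / (1.0 + np.abs(v1 - v2).sum())

For retrieval, the query image's weighted CCV would be compared against the stored vector of every database image with ccv_similarity, and results returned in descending order of the score.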
Indexed By: CSCD
Language: Chinese
CSCD Record No.: CSCD:5053453
Date Available: 2014-12-16
Source URL: [http://ir.iscas.ac.cn/handle/311060/16768]
Collection: 软件研究所_软件所图书馆_期刊论文 (Institute of Software / ISCAS Library / Journal Papers)
Recommended Citation:
GB/T 7714: 徐朝辉, 樊银亭, 闫奎名, 等. 一种基于加权颜色聚合向量的图像检索方法[J]. 微电子学与计算机, 2014, 31(2): 31-34, 38.
APA: 徐朝辉, 樊银亭, 闫奎名, 滕东兴, & 杨海燕. (2014). 一种基于加权颜色聚合向量的图像检索方法. 微电子学与计算机, 31(2), 31-34, 38.
MLA: 徐朝辉, et al. "一种基于加权颜色聚合向量的图像检索方法." 微电子学与计算机 31.2 (2014): 31-34, 38.

Deposit Method: OAI Harvesting

Source: 软件研究所 (Institute of Software)

