Chinese Academy of Sciences Institutional Repositories Grid
Image saliency detection via multi-feature and manifold-space ranking

Document Type: Conference Paper

Authors: Li XL (李晓莉)1,2,3,4,5,6; Zhao HC (赵怀慈)4; Liu YP (刘云鹏)4
Publication Date: 2021
Conference Date: January 15-17, 2021
Conference Venue: Bangkok, Thailand
Keywords: saliency detection; manifold-space ranking; multi-feature; color histogram
Pages: 76-81
Abstract: In this paper, we propose an image saliency detection method based on multi-feature extraction and manifold-space ranking. The method extracts a color-histogram feature to capture the fine-grained information of the image and a color-mean feature to capture its coarse information. To improve the detection of feature correlations between image units, manifold-space ranking is used to compute a saliency value for each image unit, producing a saliency map in each feature space. The two saliency maps are then fused to obtain the final saliency map. Extensive experiments demonstrate that the proposed method outperforms competing methods and improves the accuracy and robustness of saliency detection.
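The manifold-ranking step described in the abstract can be sketched as follows. This is a minimal illustration only, not the authors' implementation: the Gaussian affinity construction, the parameters `alpha` and `sigma`, and the toy one-dimensional color features are all assumptions. It uses the standard closed-form manifold-ranking solution f* = (I − αS)⁻¹y with the normalized affinity S = D^(−1/2) W D^(−1/2).

```python
import numpy as np

def manifold_ranking(features, query, alpha=0.99, sigma=0.1):
    """Rank graph nodes (image units) by relevance to query nodes.

    Closed-form manifold-ranking solution f* = (I - alpha*S)^-1 y,
    where S = D^-1/2 W D^-1/2 is the normalized affinity matrix.
    """
    n = len(features)
    # Pairwise affinities in the chosen feature space (e.g. the
    # color-mean or color-histogram feature of each image unit).
    d2 = np.sum((features[:, None, :] - features[None, :, :]) ** 2, axis=-1)
    W = np.exp(-d2 / (2 * sigma ** 2))
    np.fill_diagonal(W, 0)  # no self-loops
    D_inv_sqrt = np.diag(1.0 / np.sqrt(W.sum(axis=1)))
    S = D_inv_sqrt @ W @ D_inv_sqrt
    y = np.asarray(query, dtype=float)  # 1 for seed units, 0 otherwise
    return np.linalg.solve(np.eye(n) - alpha * S, y)

# Toy example: 5 "image units" with 1-D color features;
# units 0-2 form one cluster, units 3-4 another.
feats = np.array([[0.10], [0.12], [0.11], [0.90], [0.88]])
y = np.array([1.0, 1.0, 0.0, 0.0, 0.0])  # seed units 0 and 1
scores = manifold_ranking(feats, y)
# Units in the seeds' cluster rank higher than dissimilar units.
assert scores[2] > scores[3]
```

In the paper's pipeline, this ranking would be run once per feature space to produce two saliency maps, which are then fused (for instance by an element-wise combination) into the final map; the fusion rule here is likewise an assumption.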
Ownership Rank: 1
Proceedings: 2021 3rd Asia Pacific Information Technology Conference, APIT 2021
Proceedings Publisher: ACM
Place of Publication: New York
Language: English
ISBN: 978-1-4503-8810-8
Source URL: [http://ir.sia.cn/handle/173321/28938]
Collection: Shenyang Institute of Automation, Optoelectronic Information Technology Laboratory
Corresponding Author: Li XL (李晓莉)
Affiliations: 1. The Key Lab of Image Understanding and Computer Vision, Shenyang 110016, China
2.University of Chinese Academy of Sciences, Beijing 100049, China
3.Institutes for Robotics and Intelligent Manufacturing, Chinese Academy of Sciences, Shenyang 110169, China
4.Shenyang Institute of Automation, Chinese Academy of Sciences, Shenyang 110016, China
5.Shenyang Jianzhu University, Shenyang 110168, China
6.Key Laboratory of Opto-Electronic Information Processing, Shenyang 110016, China
Recommended Citation
GB/T 7714
Li XL, Zhao HC, Liu YP. Image saliency detection via multi-feature and manifold-space ranking[C]//2021 3rd Asia Pacific Information Technology Conference (APIT 2021), Bangkok, Thailand, January 15-17, 2021. New York: ACM, 2021: 76-81.

Deposit Method: OAI Harvesting

Source: Shenyang Institute of Automation


Unless otherwise stated, all content in this system is protected by copyright, and all rights are reserved.