Chinese Academy of Sciences Institutional Repositories Grid
彩色图像融合客观评价指标研究 (Research on Objective Evaluation Metrics for Color Image Fusion)

Document type: Dissertation

Author: 逄浩辰 (Pang Haochen)
Degree: Doctoral (PhD)
Defense date: 2014-07
Degree-granting institution: University of Chinese Academy of Sciences
Advisor: 朱明 (Zhu Ming)
Discipline: Mechatronic Engineering
Abstract: At present, in the field of color image fusion, no standard fused image is available to serve as a reference, so it is critical to be able to quantify both the visual perception of the fused image and the useful information it contains for subsequent processing. In this study, building on research into color image fusion algorithms, we focus on constructing objective evaluation indexes within the framework of quaternion theory for both color-with-color and color-with-infrared image fusion, and verify them experimentally.

The study begins with a brief review of color image fusion technology and introduces several commonly adopted color image fusion algorithms. On this basis, a novel multifocus color image fusion algorithm based on the quaternion wavelet transform (QWT) is proposed. Each source color image is represented in the QWT domain; different fusion rules are applied to the coarse-level and fine-level QWT coefficients to construct a multiresolution representation of the fused image, and the fused color image is then reconstructed by the inverse QWT.

Next, a no-reference objective quality-assessment index for color image fusion is proposed, based on quaternion convolution. The method models color images within the quaternion framework, thereby fully exploiting the color information of each image as a whole. Edge-detail information is extracted by quaternion convolution of a quaternion-valued edge-detection template with the source images and with the fused image; by measuring image clarity and the useful information transferred by fusion, an objective assessment index for the fused color image is established.
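As an illustrative sketch of the quaternion modeling described above (not the thesis's implementation; all function names here are hypothetical), an RGB pixel can be encoded as the pure quaternion r i + g j + b k, and a quaternion-valued template can be slid over the image with the non-commutative Hamilton product:

```python
import numpy as np

def qmul(a, b):
    """Hamilton product of quaternions stored as (..., 4) arrays [w, x, y, z]."""
    w1, x1, y1, z1 = np.moveaxis(np.asarray(a, float), -1, 0)
    w2, x2, y2, z2 = np.moveaxis(np.asarray(b, float), -1, 0)
    return np.stack([
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
    ], axis=-1)

def rgb_to_quat(img):
    """Encode an (H, W, 3) RGB image as pure quaternions (H, W, 4): 0 + r i + g j + b k."""
    q = np.zeros(img.shape[:2] + (4,))
    q[..., 1:] = img
    return q

def quat_conv1d(row, kernel):
    """Left-sided quaternion convolution of an (N, 4) row with a (K, 4) kernel.

    Quaternion multiplication is non-commutative, so left- and right-sided
    convolutions differ; hypercomplex color edge filters typically apply
    paired left/right masks rather than this single-sided version.
    """
    n, k = len(row), len(kernel)
    out = np.zeros((n - k + 1, 4))
    for i in range(n - k + 1):
        for j in range(k):
            out[i] += qmul(kernel[j], row[i + j])
    return out
```

For instance, `qmul` applied to the units i and j returns k, and an edge map is obtained by taking the quaternion norm of the filtered output at each position.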
For the fusion of color and infrared images, building on a study of local structural-similarity comparison based on the singular value decomposition (SVD) of quaternion matrices, the edge-strength information transferred from the inputs to the fused image is measured so as to balance the sensitivity of the index, and on this basis an objective assessment index for color-with-infrared image fusion is constructed.

In summary, by fully accounting for the color information of color images within the quaternion framework, the proposed methods agree well with the human visual system (HVS). Experimental results show that the proposed QWT-based fusion algorithm removes blur from the fused image more effectively than, and overall outperforms, the classical methods. The proposed objective assessment methods, which incorporate both the complete color information of the images and the detail information to which the HVS is sensitive, outperform several commonly adopted evaluation methods, showing good consistency with human visual perception and good stability.
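The quaternion-matrix SVD underlying this kind of structural comparison can be prototyped with NumPy's ordinary complex SVD through the complex adjoint representation (a minimal sketch under the convention Q = A + B j with complex blocks A and B; the function names and the toy similarity score are illustrative assumptions, not the index constructed in the thesis):

```python
import numpy as np

def quat_adjoint(A, B):
    """Complex adjoint chi(Q) of the quaternion matrix Q = A + B j,
    where A and B are complex (m, n) arrays (a quaternion w + x i + y j + z k
    is written as the complex pair A = w + x i, B = y + z i)."""
    return np.block([[A, B], [-np.conj(B), np.conj(A)]])

def quat_singular_values(A, B):
    """Singular values of Q = A + B j.

    The singular values of chi(Q) occur in equal pairs, one pair per
    singular value of Q; numpy returns them sorted in descending order,
    so keeping every other value recovers the spectrum of Q.
    """
    s = np.linalg.svd(quat_adjoint(A, B), compute_uv=False)
    return s[::2]

def sv_similarity(sa, sb, eps=1e-12):
    """Toy structural-similarity score between two singular-value spectra:
    normalized correlation in [0, 1], where 1 means identical structure."""
    sa, sb = np.asarray(sa), np.asarray(sb)
    return float(sa @ sb / (np.linalg.norm(sa) * np.linalg.norm(sb) + eps))
```

In an actual evaluation index, such spectra would be computed over local patches of the source and fused images and pooled into a single score; `sv_similarity` here only illustrates comparing two spectra.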
Language: Chinese
Date released: 2014-08-21
Source URL: http://ir.ciomp.ac.cn/handle/181722/41455
Collection: Changchun Institute of Optics, Fine Mechanics and Physics (CIOMP), CAS / CIOMP Knowledge Output
Recommended citation (GB/T 7714 format):
逄浩辰. 彩色图像融合客观评价指标研究[D]. 中国科学院大学. 2014.

Ingest method: OAI harvesting

Source: Changchun Institute of Optics, Fine Mechanics and Physics


Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.