A Dual-Generator Translation Network Fusing Texture and Structure Features for SAR and Optical Image Matching
Document Type: Journal Article
Authors | Nie, Han [2]; Fu, Zhitao [2]; Tang, Bo-Hui [1,2]; Li, Ziqian [2]; Chen, Sijing [2]; Wang, Leiguang [3] |
Journal | REMOTE SENSING |
Publication Date | 2022-06-01 |
Volume | 14 |
Issue | 12 |
Pages | 22 |
Keywords | SAR-to-optical image translation; dual-generator; texture and structure fusing; SAR and optical image matching |
DOI | 10.3390/rs14122946 |
Corresponding Author | Fu, Zhitao (zhitaofu@kust.edu.cn) |
Abstract | The matching problem for heterologous remote sensing images can be simplified to a matching problem for pseudo-homologous remote sensing images via image translation, improving matching performance. Among such applications, the translation of synthetic aperture radar (SAR) and optical images is a current focus of research. However, existing methods for SAR-to-optical translation have two main drawbacks. First, single generators usually sacrifice either structure or texture features to balance model performance and complexity, which often results in textural or structural distortion. Second, owing to the large nonlinear radiation distortions (NRDs) in SAR images, there are still visual differences between the pseudo-optical images generated by current generative adversarial networks (GANs) and real optical images. We therefore propose a dual-generator translation network that fuses structure and texture features. On the one hand, the network has dual generators, a texture generator and a structure generator, with strong cross-coupling to obtain high-accuracy structure and texture features; on the other hand, frequency-domain and spatial-domain loss functions are introduced to reduce the differences between pseudo-optical images and real optical images. Extensive quantitative and qualitative experiments show that our method achieves state-of-the-art performance on publicly available optical and SAR datasets. Compared with the next best results, our method improves the peak signal-to-noise ratio (PSNR) by 21.0%, the chromatic feature similarity (FSIMc) by 6.9%, and the structural similarity (SSIM) by 161.7% in terms of average metric values over all test images. In addition, a before-and-after translation comparison experiment shows that our method improves average keypoint repeatability by approximately 111.7% and matching accuracy by approximately 5.25%. |
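The abstract describes a dual-generator design (one generator for texture, one for structure) trained with both spatial-domain and frequency-domain losses. As a rough illustration only, below is a minimal PyTorch sketch of that idea; the paper does not specify its architecture, fusion scheme, or loss weights here, so the module names, the simple averaging fusion, and the 0.1 frequency-loss weight are all assumptions.

```python
# Minimal sketch (NOT the authors' implementation) of a dual-generator
# SAR-to-optical setup with spatial- and frequency-domain losses.
# All names, the averaging fusion, and loss weights are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleGenerator(nn.Module):
    """Toy encoder-decoder standing in for the texture/structure generators."""
    def __init__(self, in_ch=1, out_ch=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, out_ch, 3, padding=1),
        )

    def forward(self, x):
        return self.net(x)

def spatial_loss(fake, real):
    # Pixel-wise L1 distance in the spatial domain.
    return F.l1_loss(fake, real)

def frequency_loss(fake, real):
    # Compare amplitude spectra via a 2-D FFT; one common way to impose a
    # frequency-domain constraint (the paper's exact formulation may differ).
    fake_f = torch.fft.fft2(fake, norm="ortho")
    real_f = torch.fft.fft2(real, norm="ortho")
    return F.l1_loss(torch.abs(fake_f), torch.abs(real_f))

texture_gen = SimpleGenerator()    # assumed role: texture features
structure_gen = SimpleGenerator()  # assumed role: structure features

sar = torch.randn(2, 1, 64, 64)      # dummy single-channel SAR batch
optical = torch.randn(2, 3, 64, 64)  # dummy RGB optical batch

# Naive fusion of the two generator outputs; the paper couples the
# generators far more tightly than a simple average.
pseudo_optical = 0.5 * (texture_gen(sar) + structure_gen(sar))

loss = spatial_loss(pseudo_optical, optical) \
       + 0.1 * frequency_loss(pseudo_optical, optical)
loss.backward()
```

The frequency term penalizes differences between FFT amplitude spectra, which is one standard way to reduce spectral discrepancies between pseudo-optical and real optical images; the adversarial discriminator losses of the full GAN are omitted for brevity.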
WOS Keywords | ADVERSARIAL NETWORKS |
Funding Projects | National Natural Science Foundation of China [41961053]; National Natural Science Foundation of China [31860182]; Yunnan Fundamental Research Projects [202101AT070102]; Yunnan Fundamental Research Projects [202101BE070001-037]; Yunnan Fundamental Research Projects [202201AT070164] |
WOS Research Areas | Environmental Sciences & Ecology; Geology; Remote Sensing; Imaging Science & Photographic Technology |
Language | English |
WOS Record Number | WOS:000816100700001 |
Publisher | MDPI |
Funding Agencies | National Natural Science Foundation of China; Yunnan Fundamental Research Projects |
Source URL | http://ir.igsnrr.ac.cn/handle/311030/180496 |
Collection | Institute of Geographic Sciences and Natural Resources Research, Chinese Academy of Sciences |
Author Affiliations | 1. Chinese Acad Sci, Inst Geog Sci & Nat Resources Res, State Key Lab Resources & Environm Informat Syst, Beijing 100101, Peoples R China; 2. Kunming Univ Sci & Technol, Fac Land & Resources Engn, Kunming 650031, Yunnan, Peoples R China; 3. Southwest Forestry Univ, Inst Big Data & Artificial Intelligence, Kunming 650024, Yunnan, Peoples R China |
Recommended Citation (GB/T 7714) | Nie, Han, Fu, Zhitao, Tang, Bo-Hui, et al. A Dual-Generator Translation Network Fusing Texture and Structure Features for SAR and Optical Image Matching[J]. REMOTE SENSING, 2022, 14(12): 22. |
APA | Nie, Han, Fu, Zhitao, Tang, Bo-Hui, Li, Ziqian, Chen, Sijing, & Wang, Leiguang. (2022). A Dual-Generator Translation Network Fusing Texture and Structure Features for SAR and Optical Image Matching. REMOTE SENSING, 14(12), 22. |
MLA | Nie, Han, et al. "A Dual-Generator Translation Network Fusing Texture and Structure Features for SAR and Optical Image Matching". REMOTE SENSING 14.12 (2022): 22. |
Ingestion Method: OAI Harvesting
Source: Institute of Geographic Sciences and Natural Resources Research
Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.