RDNRnet: A Reconstruction Solution of NDVI Based on SAR and Optical Images by Residual-in-Residual Dense Blocks
Document Type: Journal Article
Authors | Han, Yifei (1,2); Huang, Jinliang (1); Ling, Feng (1); Gao, Xinyi (1,2); Cai, Wei (3,4); Chi, Hong (1,5)
Journal | IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING
Publication Date | 2024
Volume | 62
Pages | 14
Keywords | Image reconstruction; normalized difference vegetation index (NDVI); residual-in-residual dense block; NDVI reconstruction net (RDNRnet); synthetic aperture radar (SAR)
ISSN | 0196-2892
DOI | 10.1109/TGRS.2024.3354255 |
Corresponding Author | Chi, Hong (chihong@whigg.ac.cn)
Abstract | Reconstruction of the normalized difference vegetation index (NDVI) is a crucial prerequisite for many spatiotemporally continuous studies. To address the limitations imposed by satellite temporal resolution and challenging atmospheric conditions, combining synthetic aperture radar (SAR) and optical images from diverse sources has proven effective and is widely employed. In this study, we employ the spatial-temporal Savitzky-Golay (STSG) algorithm to rectify MODIS NDVI maps and eliminate noise-induced interruptions. Random forest (RF) and gradient boosting decision trees (GBDTs) serve as a dual filter to select the SAR indices with the highest impact on NDVI reconstruction, ensuring that the chosen indices encapsulate the most valuable information. We then conduct a series of ablation experiments and develop a deep learning network named the residual-in-residual dense block (RRDB) NDVI reconstruction net (RDNRnet), which effectively mitigates the impacts of the coarse MODIS resolution and of speckle noise in the SAR data. We also evaluate the network's performance in reconstructing NDVI across all seasons and land cover types. Our findings highlight the modified dual-polarimetric SAR vegetation index and the standard deviation (STD) of the vertical-vertical (VV) band as the most influential SAR indices. Predictions for summer exhibit the highest performance, with a coefficient of determination (R²) reaching 0.9757. By land cover type, the best performance is observed in forests, paddy fields, and dry farming fields, all with R² values exceeding 0.9580. Our adaptive NDVI reconstruction solution demonstrates robust performance across different data availability scenarios, effectively covering all seasons and land cover types.
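The pipeline's first step, per the abstract, smooths the MODIS NDVI time series with the spatial-temporal Savitzky-Golay (STSG) algorithm. STSG augments the classic temporal Savitzky-Golay filter with spatial-neighborhood information; the minimal sketch below shows only the plain temporal filter it builds on, applied to one pixel's series. The synthetic series, window length, and polynomial order are illustrative assumptions, not the paper's settings.

```python
import numpy as np
from scipy.signal import savgol_filter

# Synthetic noisy NDVI time series for a single pixel
# (46 composites, roughly one year of 8-day MODIS products).
rng = np.random.default_rng(0)
t = np.linspace(0.0, 2.0 * np.pi, 46)
ndvi_noisy = 0.5 + 0.3 * np.sin(t) + rng.normal(0.0, 0.05, 46)

# Savitzky-Golay smoothing: fit a local quadratic inside a
# 7-composite moving window and replace each point by the fit.
ndvi_smooth = savgol_filter(ndvi_noisy, window_length=7, polyorder=2)
print(ndvi_smooth[:5])
```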
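The abstract's RF + GBDT dual filter can be read as: rank the candidate SAR indices by feature importance under both a random forest and a gradient-boosting model, then keep only the indices both models rank highly. A minimal sketch of that idea with scikit-learn follows; the function name, synthetic data, and top-k cutoff are hypothetical, not taken from the paper.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor

def dual_filter_sar_indices(X, y, index_names, top_k=5):
    """Keep SAR indices ranked in the top_k by importance for BOTH models."""
    rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
    gbdt = GradientBoostingRegressor(n_estimators=200, random_state=0).fit(X, y)
    rf_top = set(np.argsort(rf.feature_importances_)[::-1][:top_k])
    gbdt_top = set(np.argsort(gbdt.feature_importances_)[::-1][:top_k])
    return [index_names[i] for i in sorted(rf_top & gbdt_top)]

# Synthetic example: X holds per-pixel SAR indices (e.g., VV/VH statistics),
# y holds the smoothed MODIS NDVI target.
rng = np.random.default_rng(1)
X = rng.random((500, 8))
y = 0.6 * X[:, 0] + 0.3 * X[:, 3] + 0.1 * rng.random(500)
print(dual_filter_sar_indices(X, y, [f"sar_index_{i}" for i in range(8)]))
```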
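RDNRnet's building block is the residual-in-residual dense block (RRDB). The PyTorch sketch below follows the generic ESRGAN-style RRDB layout: five densely connected convolutions per dense block, three dense blocks per RRDB, and scaled residual connections at both levels. The channel count, growth rate, and 0.2 scaling factor are common defaults used here as assumptions, not RDNRnet's exact configuration.

```python
import torch
import torch.nn as nn

class DenseBlock(nn.Module):
    """Five 3x3 convs with dense connections and a scaled residual output."""
    def __init__(self, channels=64, growth=32):
        super().__init__()
        # Each conv sees the input plus all earlier conv outputs;
        # the last conv maps back to the input channel count.
        self.convs = nn.ModuleList(
            nn.Conv2d(channels + i * growth,
                      growth if i < 4 else channels, 3, padding=1)
            for i in range(5)
        )
        self.act = nn.LeakyReLU(0.2, inplace=True)

    def forward(self, x):
        feats = [x]
        for i, conv in enumerate(self.convs):
            out = conv(torch.cat(feats, dim=1))
            if i < 4:
                feats.append(self.act(out))
        return x + 0.2 * out  # scaled inner residual

class RRDB(nn.Module):
    """Three dense blocks wrapped in an outer scaled residual connection."""
    def __init__(self, channels=64):
        super().__init__()
        self.blocks = nn.Sequential(*[DenseBlock(channels) for _ in range(3)])

    def forward(self, x):
        return x + 0.2 * self.blocks(x)

# Example: one RRDB applied to a 64-channel feature map.
x = torch.randn(1, 64, 32, 32)
print(RRDB()(x).shape)  # torch.Size([1, 64, 32, 32])
```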
WOS Keywords | NETWORK
Funding Project | Joint Funds of the National Natural Science Foundation of China
WOS Research Areas | Geochemistry & Geophysics; Engineering; Remote Sensing; Imaging Science & Photographic Technology
Language | English
WOS Record No. | WOS:001167008300023
Publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
Funding Organization | Joint Funds of the National Natural Science Foundation of China
Source URL | http://ir.igsnrr.ac.cn/handle/311030/203424
Collection | Institute of Geographic Sciences and Natural Resources Research, Chinese Academy of Sciences
Author Affiliations | 1. Chinese Acad Sci, Innovat Acad Precis Measurement Sci & Technol, Key Lab Monitoring & Estimate Environm & Disaster, Wuhan 430071, Peoples R China
2. Univ Chinese Acad Sci, Beijing 100049, Peoples R China
3. Hubei Assoc Surveying & Mapping, Wuhan 430000, Peoples R China
4. Hubei Geomat Technol Grp Stock Co Ltd, Wuhan 430000, Peoples R China
5. Chinese Acad Sci, Inst Geog Sci & Nat Resources Res, State Key Lab Resources & Environm Informat Syst, Beijing 100101, Peoples R China
Recommended Citation (GB/T 7714) | Han, Yifei, Huang, Jinliang, Ling, Feng, et al. RDNRnet: A Reconstruction Solution of NDVI Based on SAR and Optical Images by Residual-in-Residual Dense Blocks[J]. IEEE Transactions on Geoscience and Remote Sensing, 2024, 62: 14.
APA | Han, Yifei, Huang, Jinliang, Ling, Feng, Gao, Xinyi, Cai, Wei, & Chi, Hong. (2024). RDNRnet: A Reconstruction Solution of NDVI Based on SAR and Optical Images by Residual-in-Residual Dense Blocks. IEEE Transactions on Geoscience and Remote Sensing, 62, 14.
MLA | Han, Yifei, et al. "RDNRnet: A Reconstruction Solution of NDVI Based on SAR and Optical Images by Residual-in-Residual Dense Blocks." IEEE Transactions on Geoscience and Remote Sensing 62 (2024): 14.
Ingest Method: OAI Harvest
Source: Institute of Geographic Sciences and Natural Resources Research