AAU-Net: Attention-Based Asymmetric U-Net for Subject-Sensitive Hashing of Remote Sensing Images
Document Type | Journal Article
Authors | Ding, Kaimeng [1,2]; Chen, Shiping [3]; Wang, Yu [4]; Liu, Yueming [2]; Zeng, Yue [1]; Tian, Jin [1] |
Journal | REMOTE SENSING
Publication Date | 2021-12-01 |
Volume | 13 | Issue | 24 | Pages | 26 |
Keywords | security of remote sensing images; deep learning; subject-sensitive hashing; integrity authentication; perceptual hash; U-Net |
DOI | 10.3390/rs13245109 |
Corresponding Author | Zeng, Yue (zengy@jit.edu.cn) |
Abstract | The prerequisite for the use of remote sensing images is that their security must be guaranteed. As a special subset of perceptual hashing, subject-sensitive hashing overcomes the shortcoming of existing perceptual hashing, which cannot distinguish between "subject-related tampering" and "subject-unrelated tampering" of remote sensing images. However, existing subject-sensitive hashing still falls short in robustness. In this paper, we propose a novel attention-based asymmetric U-Net (AAU-Net) for the subject-sensitive hashing of remote sensing (RS) images. Our AAU-Net has a markedly asymmetric structure, which is important for improving the robustness of features by combining the attention mechanism with the characteristics of subject-sensitive hashing. On the basis of AAU-Net, a subject-sensitive hashing algorithm is developed to integrate the features of the various bands of RS images. Our experimental results show that our AAU-Net-based subject-sensitive hashing algorithm is more robust than existing deep learning models such as Attention U-Net and MUM-Net, while its tampering sensitivity remains at the same level as that of Attention U-Net and MUM-Net. |
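The record above only summarizes the approach at abstract level; the actual AAU-Net architecture and hashing algorithm are given in the full paper (see the DOI). As a rough illustration of the kind of pipeline the abstract describes (an attention-gated, asymmetric encoder-decoder whose features are binarized into a compact hash that is then compared for integrity authentication), the following is a minimal sketch in PyTorch. It is not the paper's AAU-Net: the layer counts, channel widths, attention-gate design, 256-bit hash length, and sign-thresholding binarization are all assumptions made purely for illustration.

```python
# Minimal illustrative sketch of a subject-sensitive hashing pipeline built on an
# attention-gated, asymmetric encoder-decoder. NOT the paper's AAU-Net: all
# architectural and hashing choices below are assumptions for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F


def conv_block(in_ch, out_ch):
    """Two 3x3 convolutions with batch norm and ReLU."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
    )


class AttentionGate(nn.Module):
    """Additive attention gate (Attention U-Net style) that re-weights skip features."""

    def __init__(self, gate_ch, skip_ch, inter_ch):
        super().__init__()
        self.w_g = nn.Conv2d(gate_ch, inter_ch, 1)
        self.w_x = nn.Conv2d(skip_ch, inter_ch, 1)
        self.psi = nn.Conv2d(inter_ch, 1, 1)

    def forward(self, gate, skip):
        g = F.interpolate(self.w_g(gate), size=skip.shape[2:], mode="bilinear", align_corners=False)
        alpha = torch.sigmoid(self.psi(F.relu(g + self.w_x(skip))))
        return skip * alpha


class ToyAsymmetricAttentionUNet(nn.Module):
    """Asymmetric in the sense that the decoder is shallower and narrower than the encoder."""

    def __init__(self, in_ch=3, hash_bits=256):
        super().__init__()
        self.enc1 = conv_block(in_ch, 32)
        self.enc2 = conv_block(32, 64)
        self.enc3 = conv_block(64, 128)
        self.pool = nn.MaxPool2d(2)
        self.att = AttentionGate(gate_ch=128, skip_ch=64, inter_ch=32)
        self.dec = conv_block(128 + 64, 64)            # single, lighter decoder stage
        self.head = nn.Linear(64, hash_bits)           # projects pooled features to hash logits

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        e3 = self.enc3(self.pool(e2))
        skip = self.att(e3, e2)                        # attention-weighted skip connection
        up = F.interpolate(e3, size=skip.shape[2:], mode="bilinear", align_corners=False)
        d = self.dec(torch.cat([up, skip], dim=1))
        feat = F.adaptive_avg_pool2d(d, 1).flatten(1)  # global average pooling
        return self.head(feat)                         # real-valued hash logits


def binary_hash(model, image):
    """Binarize logits into a compact hash (sign thresholding is an assumption)."""
    with torch.no_grad():
        return (model(image) > 0).to(torch.uint8)


def hamming_distance(h1, h2):
    """Normalized Hamming distance between two binary hashes."""
    return (h1 != h2).float().mean().item()


if __name__ == "__main__":
    model = ToyAsymmetricAttentionUNet().eval()
    img_a = torch.rand(1, 3, 256, 256)                 # stand-in for a remote sensing tile
    img_b = img_a + 0.01 * torch.randn_like(img_a)     # mild, subject-unrelated perturbation
    d = hamming_distance(binary_hash(model, img_a), binary_hash(model, img_b))
    print(f"normalized Hamming distance: {d:.4f}")     # compared against a chosen threshold
```

In a pipeline of this kind, the network would be trained so that subject-related tampering flips many hash bits while subject-unrelated changes (compression, mild noise, format conversion) flip few, and authentication reduces to comparing the normalized Hamming distance against a threshold; per-band hashes can be concatenated to cover multi-band RS imagery.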
WOS Keywords | NETWORK; SEGMENTATION; FUSION; IMPACT; RIVER |
Funding Projects | National Natural Science Foundation of China [41801303]; National Natural Science Foundation of China [42101428]; Scientific Research Fund of Jinling Institute of Technology [jit-fhxm-201604]; Scientific Research Fund of Jinling Institute of Technology [jit-b-201520]; Qing Lan Project |
WOS Research Areas | Environmental Sciences & Ecology; Geology; Remote Sensing; Imaging Science & Photographic Technology |
Language | English |
WOS Accession Number | WOS:000742132400001 |
Publisher | MDPI |
Funding Organizations | National Natural Science Foundation of China; Scientific Research Fund of Jinling Institute of Technology; Qing Lan Project |
Source URL | http://ir.igsnrr.ac.cn/handle/311030/169672 |
Collection | Institute of Geographic Sciences and Natural Resources Research, Chinese Academy of Sciences |
Author Affiliations | 1. Jinling Inst Technol, Nanjing 211169, Peoples R China; 2. Chinese Acad Sci, Inst Geog Sci & Nat Resources Res, State Key Lab Resource & Environm Informat Syst, Beijing 100101, Peoples R China; 3. CSIRO Data61, Sydney, NSW 1710, Australia; 4. Changjiang Nanjing Waterway Bur, Nanjing 210011, Peoples R China |
Recommended Citation (GB/T 7714) | Ding, Kaimeng, Chen, Shiping, Wang, Yu, et al. AAU-Net: Attention-Based Asymmetric U-Net for Subject-Sensitive Hashing of Remote Sensing Images[J]. REMOTE SENSING, 2021, 13(24): 26. |
APA | Ding, Kaimeng, Chen, Shiping, Wang, Yu, Liu, Yueming, Zeng, Yue, & Tian, Jin. (2021). AAU-Net: Attention-Based Asymmetric U-Net for Subject-Sensitive Hashing of Remote Sensing Images. REMOTE SENSING, 13(24), 26. |
MLA | Ding, Kaimeng, et al. "AAU-Net: Attention-Based Asymmetric U-Net for Subject-Sensitive Hashing of Remote Sensing Images". REMOTE SENSING 13.24 (2021): 26. |
Ingest Method: OAI harvesting
Source: Institute of Geographic Sciences and Natural Resources Research