Semi-U-Net: A Lightweight Deep Neural Network for Subject-Sensitive Hashing of HRRS Images
Document Type: Journal Article
Authors | Ding, Kaimeng (1,4,5); Su, Shoubao (4); Xu, Nan (3); Jiang, Tingting (2)
Journal | IEEE ACCESS
Publication Date | 2021
Volume | 9
Issue | N/A
Pages | 60280-60295
Keywords | Authentication; Feature extraction; Remote sensing; Neural networks; Data mining; Cryptography; Computational modeling; Subject-sensitive hashing; lightweight deep neural network; integrity authentication; HRRS image; U-net
ISSN | 2169-3536
DOI | 10.1109/ACCESS.2021.3074055 |
Abstract | As a special case of perceptual hashing, subject-sensitive hashing can realize "subject-biased" integrity authentication of high resolution remote sensing (HRRS) images, which overcomes the deficiencies of existing integrity authentication technologies. However, the existing deep neural networks for subject-sensitive hashing have disadvantages such as high model complexity and low computational efficiency. In this paper, we propose an efficient and lightweight deep neural network named Semi-U-Net to achieve efficient subject-sensitive hashing. The proposed Semi-U-Net makes the network lightweight in three ways: First, considering the general process of perceptual hashing, it adopts a semi-u-shaped structure, which simplifies the model structure and prevents the model from extracting too much redundant information, thereby enhancing the robustness of the algorithm; Second, the number of model parameters and the computational cost are significantly reduced by using depthwise separable convolutions throughout the asymmetric network; Third, the number of model parameters is further compressed by using dropout layers several times. The experimental results show that the size of our Semi-U-Net model is only 5.38M, which is only 1/27 of MUM-net and 1/15 of MultiResUnet. The speed of the Semi-U-Net based subject-sensitive hashing algorithm is 88.6 FPS, which is 2.89 times faster than the MultiResUnet-based algorithm and 2.1 times faster than the MUM-net-based algorithm. The FLOPs of Semi-U-Net are only 1/28 of MUM-net's and 1/16 of MultiResUnet's.
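The abstract attributes much of the parameter and FLOP savings to replacing standard convolutions with depthwise separable convolutions throughout the network. As a rough illustration only (not the authors' Semi-U-Net code; PyTorch as the framework and the 64→128 channel sizes are assumptions), the sketch below builds one such block and compares its parameter count with a standard 3x3 convolution:

```python
# Minimal sketch of a depthwise separable convolution block (PyTorch).
# Illustrative only: layer widths and normalization/activation choices are
# assumptions, not the actual Semi-U-Net architecture from the paper.
import torch
import torch.nn as nn


class DepthwiseSeparableConv(nn.Module):
    """3x3 depthwise convolution followed by a 1x1 pointwise convolution."""

    def __init__(self, in_channels: int, out_channels: int):
        super().__init__()
        # Depthwise: one 3x3 filter per input channel (groups=in_channels).
        self.depthwise = nn.Conv2d(in_channels, in_channels, kernel_size=3,
                                   padding=1, groups=in_channels, bias=False)
        # Pointwise: 1x1 convolution mixes channels and sets the output width.
        self.pointwise = nn.Conv2d(in_channels, out_channels, kernel_size=1,
                                   bias=False)
        self.bn = nn.BatchNorm2d(out_channels)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.act(self.bn(self.pointwise(self.depthwise(x))))


if __name__ == "__main__":
    # Compare parameter counts against a standard 3x3 convolution.
    standard = nn.Conv2d(64, 128, kernel_size=3, padding=1, bias=False)
    separable = DepthwiseSeparableConv(64, 128)
    n_std = sum(p.numel() for p in standard.parameters())
    n_sep = sum(p.numel() for p in separable.parameters())
    print(f"standard 3x3 conv: {n_std} params, separable block: {n_sep} params")
    y = separable(torch.randn(1, 64, 256, 256))
    print(y.shape)  # torch.Size([1, 128, 256, 256])
```

For these assumed channel sizes the separable block uses roughly one eighth of the parameters of the standard convolution, the kind of per-layer saving that, applied across a whole network, is consistent with the model-size and FLOP ratios reported in the abstract.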
Funding Projects | National Natural Science Foundation of China [41801303]; Funds for Jiangsu Provincial Sci-Tech Innovation Team of Swarm Computing; Scientific Research Hatch Fund of Jinling Institute of Technology [jitrcyj-201505]; Scientific Research Hatch Fund of Jinling Institute of Technology [jit-b-201520]; Scientific Research Hatch Fund of Jinling Institute of Technology [jit-fhxm-201604]; Scientific Research Hatch Fund of Jinling Institute of Technology [D2020005]; Qing Lan Project
WOS Research Areas | Computer Science; Engineering; Telecommunications
Language | English
WOS Accession Number | WOS:000642761600001
Publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
Funding Organizations | National Natural Science Foundation of China; Funds for Jiangsu Provincial Sci-Tech Innovation Team of Swarm Computing; Scientific Research Hatch Fund of Jinling Institute of Technology; Qing Lan Project
Source URL | http://ir.igsnrr.ac.cn/handle/311030/161846
Collection | State Key Laboratory of Resources and Environmental Information System - Foreign-Language Papers
Author Affiliations |
1. Jinling Inst Technol, Sch Network & Commun Engn, Nanjing 211169, Peoples R China
2. Ericsson Nanjing Commun Co Ltd, Nanjing 211100, Peoples R China
3. Jinling Inst Technol, Sch Intelligent Sci & Control Engn, Nanjing 211169, Peoples R China
4. Jiangsu Key Lab Data Sci & Smart Software, Nanjing 211169, Peoples R China
5. Chinese Acad Sci, Inst Geog Sci & Nat Resources Res, State Key Lab Resource & Environm Informat Syst, Beijing 100101, Peoples R China
Recommended Citation (GB/T 7714) | Ding, Kaimeng, Su, Shoubao, Xu, Nan, et al. Semi-U-Net: A Lightweight Deep Neural Network for Subject-Sensitive Hashing of HRRS Images[J]. IEEE ACCESS, 2021, 9(N/A): 60280-60295.
APA | Ding, Kaimeng, Su, Shoubao, Xu, Nan, & Jiang, Tingting. (2021). Semi-U-Net: A Lightweight Deep Neural Network for Subject-Sensitive Hashing of HRRS Images. IEEE ACCESS, 9(N/A), 60280-60295.
MLA | Ding, Kaimeng, et al. "Semi-U-Net: A Lightweight Deep Neural Network for Subject-Sensitive Hashing of HRRS Images." IEEE ACCESS 9.N/A (2021): 60280-60295.
Indexing Method: OAI harvest
Source: Institute of Geographic Sciences and Natural Resources Research