Chinese Academy of Sciences Institutional Repositories Grid
Neurons Merging Layer: Towards Progressive Redundancy Reduction for Deep Supervised Hashing

Document Type: Conference Paper

Authors: Fu, Chaoyou 1,3,4; Song, Liangchen 1; Wu, Xiang 4; Wang, Guoli 1; He, Ran 2,3,4
Publication Year: 2019
Conference Date: 2019.8.10
Conference Venue: Macau, China
Abstract

Deep supervised hashing has become an active topic in information retrieval. It generates hashing bits from the output neurons of a deep hashing network. During binary discretization, there often exists considerable redundancy between hashing bits, which degrades retrieval performance in terms of both storage and accuracy. This paper proposes a simple yet effective Neurons Merging Layer (NMLayer) for deep supervised hashing. A graph is constructed to represent the redundancy relationship between hashing bits and is used to guide the learning of the hashing network. Specifically, the graph is learned dynamically by a novel mechanism defined over our active and frozen phases. According to the learned relationship, the NMLayer merges redundant neurons together to balance the importance of each output neuron. Moreover, multiple NMLayers are progressively trained for a deep hashing network to learn a more compact hashing code from a long redundant code. Extensive experiments on four datasets demonstrate that our proposed method outperforms state-of-the-art hashing methods.
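
Illustration only: the abstract describes merging redundant output neurons guided by a learned redundancy graph. The minimal PyTorch sketch below shows one simple way such a merging step could look, assuming redundancy is estimated from pairwise activation correlations on a batch and strongly correlated neurons are averaged into a shorter code. The class name RedundancyMerge, the threshold parameter, and the correlation-based grouping are hypothetical; this is not the paper's NMLayer or its active/frozen training mechanism.

# A minimal sketch (not the authors' implementation) of merging redundant
# output neurons of a hashing network. Redundancy between bits is estimated
# from pairwise correlation on a batch; highly correlated neurons are averaged
# together, yielding a shorter code. All names here are hypothetical.

import torch
import torch.nn as nn


class RedundancyMerge(nn.Module):
    """Merge output neurons whose activations are highly correlated."""

    def __init__(self, threshold: float = 0.9):
        super().__init__()
        self.threshold = threshold
        self.groups = None  # list of lists of neuron indices to merge

    @torch.no_grad()
    def build_groups(self, activations: torch.Tensor) -> None:
        """Build merge groups from a batch of pre-binarization activations
        (shape: batch_size x num_bits) via a thresholded correlation graph."""
        corr = torch.corrcoef(activations.t()).abs()  # num_bits x num_bits
        num_bits = corr.shape[0]
        visited, groups = set(), []
        for i in range(num_bits):
            if i in visited:
                continue
            # Greedily group neuron i with every not-yet-grouped neuron it is
            # strongly correlated with (a simple stand-in for a learned graph).
            group = [i] + [
                j for j in range(i + 1, num_bits)
                if j not in visited and corr[i, j] > self.threshold
            ]
            visited.update(group)
            groups.append(group)
        self.groups = groups

    def forward(self, activations: torch.Tensor) -> torch.Tensor:
        """Average the activations inside each group to produce a shorter code."""
        if self.groups is None:
            return activations
        merged = [activations[:, g].mean(dim=1, keepdim=True) for g in self.groups]
        return torch.cat(merged, dim=1)


if __name__ == "__main__":
    torch.manual_seed(0)
    feats = torch.randn(128, 64)                          # 64-bit pre-binarization outputs
    feats[:, 1] = feats[:, 0] + 0.01 * torch.randn(128)   # make bit 1 redundant with bit 0
    layer = RedundancyMerge(threshold=0.9)
    layer.build_groups(feats)
    codes = torch.sign(layer(feats))                      # binarize the merged, shorter code
    print(f"{feats.shape[1]} bits -> {codes.shape[1]} bits after merging")

In this toy run the two artificially correlated bits collapse into one, shortening the code by a single bit; the paper instead learns the redundancy relationship and applies the merging progressively across multiple layers during training.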

Source URL: http://ir.ia.ac.cn/handle/173211/48688
Collection: Institute of Automation, Center for Research on Intelligent Perception and Computing
Corresponding Author: He, Ran
Author Affiliations:
1. Horizon Robotics
2. Center for Excellence in Brain Science and Intelligence Technology, CAS
3. University of Chinese Academy of Sciences
4. NLPR & CRIPAC, Institute of Automation, Chinese Academy of Sciences
Recommended Citation (GB/T 7714):
Fu, Chaoyou, Song, Liangchen, Wu, Xiang, et al. Neurons Merging Layer: Towards Progressive Redundancy Reduction for Deep Supervised Hashing[C]. In: . Macau, China. 2019.8.10.

Ingest Method: OAI harvesting

Source: Institute of Automation

