Chinese Academy of Sciences Institutional Repositories Grid
Visible-Infrared Person Re-Identification via Partially Interactive Collaboration

Document Type: Journal Article

Authors: Zheng, Xiangtao (5); Chen, Xiumei (2,3,4); Lu, Xiaoqiang (1)
Journal: IEEE TRANSACTIONS ON IMAGE PROCESSING
Publication Date: 2022
Volume: 31, Pages: 6951-6963
Keywords: Collaboration; Feature extraction; Training; Federated learning; Cameras; Task analysis; Representation learning; Person re-identification; cross-modality; collaborative learning; partially interactive-shared
ISSN: 1057-7149; 1941-0042
DOI: 10.1109/TIP.2022.3217697
Rights Ranking: 1
Abstract:

The visible-infrared person re-identification (VI-ReID) task aims to retrieve the same person across visible and infrared images. VI-ReID is challenging because images captured in different spectra exhibit a large cross-modality discrepancy. Many methods adopt a two-stream network and design additional constraints to extract features shared across modalities. However, the interaction between the feature extraction processes of the different modalities is rarely considered. In this paper, a partially interactive collaboration (PIC) method is proposed to exploit the complementary information of different modalities and reduce the modality gap for VI-ReID. Specifically, the proposed method is realized in a partially interactive-shared architecture consisting of collaborative shallow layers and shared deep layers. The collaborative shallow layers model the interaction between the modality-specific features of the two modalities, encouraging their feature extraction processes to constrain each other and enhance the feature representations. The shared deep layers further embed the modality-specific features into a common space to endow them with the same identity discriminability. To ensure that the interactive collaborative learning is carried out effectively, a conventional loss and a collaborative loss are used jointly to train the whole network. Extensive experiments on two publicly available VI-ReID datasets verify the superiority of the proposed PIC method. Specifically, it achieves rank-1 accuracies of 83.6% and 57.5% on the RegDB and SYSU-MM01 datasets, respectively.
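For illustration, the following minimal PyTorch sketch shows one way the partially interactive-shared design described above could be organized: two modality-specific shallow branches whose intermediate features interact, followed by deep layers shared by both modalities. This is an assumption-based sketch, not the authors' implementation; the layer sizes, the 1x1-convolution interaction rule between branches, and the classifier head are hypothetical choices made purely for demonstration.

import torch
import torch.nn as nn

class PartiallyInteractiveShared(nn.Module):
    """Illustrative sketch of a partially interactive-shared two-stream network (not the authors' code)."""
    def __init__(self, num_ids: int = 395):
        super().__init__()
        # Modality-specific shallow layers (one branch per spectrum).
        self.visible_shallow = self._shallow_block()
        self.infrared_shallow = self._shallow_block()
        # Assumed interaction rule: 1x1 convolutions that project one branch's
        # features before adding them to the other branch (the collaboration step).
        self.exchange_v = nn.Conv2d(64, 64, kernel_size=1)
        self.exchange_i = nn.Conv2d(64, 64, kernel_size=1)
        # Deep layers shared by both modalities embed features into a common space.
        self.shared_deep = nn.Sequential(
            nn.Conv2d(64, 128, kernel_size=3, stride=2, padding=1),
            nn.BatchNorm2d(128),
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
        )
        self.classifier = nn.Linear(128, num_ids)

    @staticmethod
    def _shallow_block() -> nn.Sequential:
        return nn.Sequential(
            nn.Conv2d(3, 64, kernel_size=7, stride=2, padding=3),
            nn.BatchNorm2d(64),
            nn.ReLU(inplace=True),
        )

    def forward(self, visible: torch.Tensor, infrared: torch.Tensor):
        v = self.visible_shallow(visible)
        i = self.infrared_shallow(infrared)
        # Collaborative shallow step: each branch is refined with a projection
        # of the other branch's modality-specific features.
        v = v + self.exchange_v(i)
        i = i + self.exchange_i(v)
        # Shared deep layers map both modalities into the same embedding space.
        feat_v, feat_i = self.shared_deep(v), self.shared_deep(i)
        return self.classifier(feat_v), self.classifier(feat_i), feat_v, feat_i

if __name__ == "__main__":
    model = PartiallyInteractiveShared(num_ids=10)
    rgb = torch.randn(2, 3, 288, 144)  # visible images
    ir = torch.randn(2, 3, 288, 144)   # infrared images replicated to 3 channels
    logits_v, logits_i, fv, fi = model(rgb, ir)
    print(logits_v.shape, fv.shape)    # torch.Size([2, 10]) torch.Size([2, 128])

Per the abstract, such a network would be trained jointly with a conventional identity loss on the classifier outputs and an additional collaborative loss that couples the two branches; the exact form of the collaborative loss is defined in the paper and is not reproduced here.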

Language: English
WOS Accession Number: WOS:000880642200003
Publisher: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
Source URL: http://ir.opt.ac.cn/handle/181661/96239
Collection: Xi'an Institute of Optics and Precision Mechanics, Center for Optical Imagery Analysis and Learning
Corresponding Author: Lu, Xiaoqiang
Author Affiliations:
1.Qiyuan Lab, Beijing 100095, Peoples R China
2.Chinese Acad Sci, Xian Inst Opt & Precis Mech, Xian 710119, Peoples R China
3.Xidian Univ, Sch Comp Sci & Technol, Xian 710071, Peoples R China
4.Xidian Univ, Hangzhou Inst Technol, Hangzhou 311200, Peoples R China
5.Chinese Acad Sci, Xian Inst Opt & Precis Mech, Key Lab Spectral Imaging Technol CAS, Xian 710119, Peoples R China
Recommended Citation
GB/T 7714
Zheng, Xiangtao, Chen, Xiumei, Lu, Xiaoqiang. Visible-Infrared Person Re-Identification via Partially Interactive Collaboration[J]. IEEE TRANSACTIONS ON IMAGE PROCESSING, 2022, 31: 6951-6963.
APA: Zheng, Xiangtao, Chen, Xiumei, & Lu, Xiaoqiang. (2022). Visible-Infrared Person Re-Identification via Partially Interactive Collaboration. IEEE TRANSACTIONS ON IMAGE PROCESSING, 31, 6951-6963.
MLA: Zheng, Xiangtao, et al. "Visible-Infrared Person Re-Identification via Partially Interactive Collaboration." IEEE TRANSACTIONS ON IMAGE PROCESSING 31 (2022): 6951-6963.

Ingest Method: OAI Harvesting

Source: Xi'an Institute of Optics and Precision Mechanics

