Adaptively Weighted k-Tuple Metric Network for Kinship Verification
Document Type: Journal Article
Authors | Huang, Sheng5,6; Lin, Jingkai5; Huangfu, Luwen1,4; Xing, Yun5; Hu, Junlin7; Zeng, Daniel Dajun2,3 |
Journal | IEEE TRANSACTIONS ON CYBERNETICS |
Publication Date | 2022-04-19 |
Pages | 14 |
ISSN | 2168-2267 |
Keywords | Measurement; Feature extraction; Task analysis; Faces; Deep learning; Convolutional neural networks; Genetics; kinship verification; metric learning; relation network (RN); triplet loss |
DOI | 10.1109/TCYB.2022.3163707 |
Corresponding Author | Huang, Sheng (huangsheng@cqu.edu.cn) |
Abstract | Facial image-based kinship verification is a rapidly growing field in computer vision and biometrics. The key to determining whether a pair of facial images has a kin relation is to train a model that can enlarge the margin between the faces that have no kin relation while reducing the distance between faces that have a kin relation. Most existing approaches primarily exploit duplet (i.e., two input samples without cross pair) or triplet (i.e., single negative pair for each positive pair with low-order cross pair) information, omitting discriminative features from multiple negative pairs. These approaches suffer from weak generalizability, resulting in unsatisfactory performance. Inspired by human visual systems that incorporate both low-order and high-order cross-pair information from local and global perspectives, we propose to leverage high-order cross-pair features and develop a novel end-to-end deep learning model called the adaptively weighted k-tuple metric network (AWk-TMN). Our main contributions are three-fold. First, a novel cross-pair metric learning loss based on k-tuplet loss is introduced. It naturally captures both the low-order and high-order discriminative features from multiple negative pairs. Second, an adaptively weighted scheme is formulated to better highlight hard negative examples among multiple negative pairs, leading to enhanced performance. Third, the model utilizes multiple levels of convolutional features and jointly optimizes feature and metric learning to further exploit the low-order and high-order representational power. Extensive experimental results on three popular kinship verification datasets demonstrate the effectiveness of our proposed AWk-TMN approach compared with several state-of-the-art approaches. The source codes and models are released. |
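The loss the abstract describes — a k-tuplet objective over one positive pair and k negative pairs, with adaptive weights that emphasize hard negatives — can be sketched as follows. This is an illustrative reconstruction, not the authors' exact formulation: the function name `awk_tuple_loss`, the temperature `beta`, and the softmax weighting over negative distances are assumptions made for the sketch.

```python
import math

def awk_tuple_loss(d_pos, d_negs, beta=1.0):
    """Sketch of an adaptively weighted k-tuple metric loss (hypothetical form).

    d_pos:  distance between the anchor face and its kin (positive) face.
    d_negs: distances between the anchor and k non-kin (negative) faces.
    beta:   temperature controlling how sharply hard negatives are weighted.
    """
    # Adaptive weights: softmax over negated distances, so the closest
    # (hardest) negatives receive the largest weights.
    exps = [math.exp(-beta * d) for d in d_negs]
    z = sum(exps)
    weights = [e / z for e in exps]
    # Log-sum style margin term: penalized whenever the positive distance
    # is not smaller than a negative distance, weighted by hardness.
    return math.log(1.0 + sum(w * math.exp(d_pos - d)
                              for w, d in zip(weights, d_negs)))
```

For a fixed set of negatives, the loss shrinks as the positive pair moves closer than the negatives, which is the margin behavior the abstract describes; the softmax weighting is what lets a single hard negative dominate the gradient instead of being averaged away among k easy ones.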
WOS Keywords | KIN RECOGNITION SIGNALS ; FACE ; DEEP ; DESCRIPTOR ; FEATURES ; SUBJECT |
Funding Projects | National Natural Science Foundation of China [62176030] ; Natural Science Foundation of Chongqing [cstc2021jcyj-msxmX0568] |
WOS Research Areas | Automation & Control Systems ; Computer Science |
Language | English |
Publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC |
WOS Accession Number | WOS:000785746000001 |
Funding Organizations | National Natural Science Foundation of China ; Natural Science Foundation of Chongqing |
Source URL | [http://ir.ia.ac.cn/handle/173211/48403] |
Collection | Institute of Automation, State Key Laboratory of Management and Control for Complex Systems, Research Center for Internet Big Data and Security Informatics |
Author Affiliations |
1. San Diego State Univ, Ctr Human Dynam Mobile Age, San Diego, CA 92182 USA
2. Univ Chinese Acad Sci, Sch Econ & Management, Beijing 100190, Peoples R China
3. Chinese Acad Sci, Inst Automat, Beijing 100045, Peoples R China
4. San Diego State Univ, Fowler Coll Business, San Diego, CA 92182 USA
5. Chongqing Univ, Sch Big Data & Software Engn, Chongqing 400044, Peoples R China
6. Chongqing Univ, Key Lab Dependable Serv Comp Cyber Phys Soc, Minist Educ, Chongqing 400044, Peoples R China
7. Beihang Univ, Sch Software, Beijing 100191, Peoples R China
Recommended Citation (GB/T 7714) | Huang, Sheng, Lin, Jingkai, Huangfu, Luwen, et al. Adaptively Weighted k-Tuple Metric Network for Kinship Verification[J]. IEEE TRANSACTIONS ON CYBERNETICS, 2022: 14.
APA | Huang, Sheng, Lin, Jingkai, Huangfu, Luwen, Xing, Yun, Hu, Junlin, & Zeng, Daniel Dajun. (2022). Adaptively Weighted k-Tuple Metric Network for Kinship Verification. IEEE TRANSACTIONS ON CYBERNETICS, 14.
MLA | Huang, Sheng, et al. "Adaptively Weighted k-Tuple Metric Network for Kinship Verification." IEEE TRANSACTIONS ON CYBERNETICS (2022): 14.
Ingest Method: OAI Harvesting
Source: Institute of Automation