Label Informed Contrastive Pretraining for Node Importance Estimation on Knowledge Graphs
Document Type | Journal Article |
Authors | Zhang, Tianyu1; Hou, Chengbin2,3; Jiang, Rui1; Zhang, Xuegong1; Zhou, Chenghu4; Tang, Ke5; Lv, Hairong1,3 |
Journal | IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS |
Publication Date | 2024-02-21 |
Pages | 15 |
Keywords | Contrastive pretraining; graph neural networks (GNNs); knowledge graphs (KGs); node importance estimation (NIE) |
ISSN | 2162-237X |
DOI | 10.1109/TNNLS.2024.3363695 |
Corresponding Authors | Hou, Chengbin (chengbin.hou10@foxmail.com); Lv, Hairong (lvhairong@tsinghua.edu.cn) |
Abstract | Node importance estimation (NIE) is the task of inferring the importance scores of the nodes in a graph. Due to the availability of richer data and knowledge, recent NIE research has focused on knowledge graphs (KGs) for predicting future or missing node importance scores. Existing state-of-the-art NIE methods train the model with available labels and treat every node of interest equally before training. However, nodes with higher importance often require or receive more attention in real-world scenarios; e.g., people may care more about the movies or webpages with higher importance. To this end, we introduce Label Informed ContrAstive Pretraining (LICAP) to the NIE problem so as to be better aware of the nodes with high importance scores. Specifically, LICAP is a novel type of contrastive learning (CL) framework that aims to fully utilize continuous labels to generate contrastive samples for pretraining embeddings. For the NIE problem, LICAP adopts a novel sampling strategy, called top nodes preferred hierarchical sampling, which first groups all nodes of interest into a top bin and a non-top bin based on node importance scores, and then divides the nodes within the top bin into several finer bins, also based on the scores. The contrastive samples are generated from these bins and are then used to pretrain node embeddings of KGs via the newly proposed predicate-aware graph attention networks (PreGATs), so as to better separate the top nodes from non-top nodes and to distinguish the top nodes within the top bin by preserving the relative order among the finer bins. Extensive experiments demonstrate that LICAP-pretrained embeddings can further boost the performance of existing NIE methods and achieve new state-of-the-art performance on both regression and ranking metrics. The source code for reproducibility is available at https://github.com/zhangtia16/LICAP. |
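The sampling strategy described in the abstract lends itself to a short illustration. Below is a minimal Python sketch of top nodes preferred hierarchical sampling as described above, not the authors' implementation (see their repository for that); the function names and the parameters `top_ratio` and `n_finer_bins` are hypothetical. Nodes are split into a top bin and a non-top bin by importance score, the top bin is subdivided into finer bins, and contrastive triples are drawn from the bins.

```python
# Sketch of "top nodes preferred hierarchical sampling" (assumed parameters).
import numpy as np

def hierarchical_bins(scores, top_ratio=0.2, n_finer_bins=5):
    """Split nodes into a top bin and a non-top bin by importance score,
    then divide the top bin into finer bins, preserving score order."""
    order = np.argsort(scores)[::-1]            # node indices, descending score
    n_top = max(1, int(len(scores) * top_ratio))
    top, nontop = order[:n_top], order[n_top:]  # coarse split: top vs. non-top
    finer = np.array_split(top, n_finer_bins)   # finer bins within the top bin
    return finer, nontop

def sample_contrastive_triples(finer, nontop, rng=np.random.default_rng(0)):
    """Draw (anchor, positive, negative) triples: the positive comes from the
    anchor's finer bin, the negative from the non-top bin. The relative order
    among finer bins can supply additional ranking-style contrasts."""
    triples = []
    for bin_nodes in finer:
        if len(bin_nodes) < 2:
            continue
        a, p = rng.choice(bin_nodes, size=2, replace=False)
        n = rng.choice(nontop)
        triples.append((a, p, n))
    return triples

# Usage with synthetic importance scores for 100 nodes:
scores = np.random.default_rng(0).random(100)
finer, nontop = hierarchical_bins(scores)
triples = sample_contrastive_triples(finer, nontop)
```

Under this reading, the coarse top/non-top contrast pushes high-importance nodes away from the rest in embedding space, while sampling within finer bins preserves the ordering among the most important nodes, matching the two goals stated in the abstract.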
Funding Project | National Natural Science Foundation of China |
WoS Research Areas | Computer Science; Engineering |
Language | English |
WoS Accession Number | WOS:001176551300001 |
Publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC |
Funding Agency | National Natural Science Foundation of China |
Source URL | http://ir.igsnrr.ac.cn/handle/311030/203427 |
Collection | Institute of Geographic Sciences and Natural Resources Research, Chinese Academy of Sciences |
Author Affiliations |
1. Tsinghua Univ, Beijing Natl Res Ctr Informat Sci & Technol, Dept Automat, Bioinformat Div, Minist Educ, Key Lab Bioinformat, Beijing 100084, Peoples R China
2. Fuzhou Univ, Sch Comp & Data Sci, Fuzhou 350109, Peoples R China
3. Fuzhou Inst Data Technol, Fuzhou 350200, Peoples R China
4. Chinese Acad Sci, State Key Lab Resources & Environm Informat Syst, Inst Geog Sci & Nat Resources Res, Beijing 100101, Peoples R China
5. Southern Univ Sci & Technol, Dept Comp Sci & Engn, Shenzhen 518055, Peoples R China
Recommended Citation (GB/T 7714) | Zhang, Tianyu, Hou, Chengbin, Jiang, Rui, et al. Label Informed Contrastive Pretraining for Node Importance Estimation on Knowledge Graphs[J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024: 15. |
APA | Zhang, Tianyu., Hou, Chengbin., Jiang, Rui., Zhang, Xuegong., Zhou, Chenghu., ... & Lv, Hairong. (2024). Label Informed Contrastive Pretraining for Node Importance Estimation on Knowledge Graphs. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 15. |
MLA | Zhang, Tianyu, et al. "Label Informed Contrastive Pretraining for Node Importance Estimation on Knowledge Graphs." IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS (2024): 15. |
Ingestion Method: OAI harvesting
Source: Institute of Geographic Sciences and Natural Resources Research