Interactive Contrastive Learning for Self-Supervised Entity Alignment

Cited by: 17
Authors
Zeng, Kaisheng [1 ]
Dong, Zhenhao [2 ]
Hou, Lei [3 ]
Cao, Yixin [4 ]
Hu, Minghao [5 ]
Yu, Jifan [1 ]
Lv, Xin [1 ]
Cao, Lei [1 ]
Wang, Xin [1 ]
Liu, Haozhuang [1 ]
Huang, Yi [6 ]
Feng, Junlan [6 ]
Wan, Jing [2 ]
Li, Juanzi [7 ]
Feng, Ling [7 ]
Affiliations
[1] Tsinghua Univ, Beijing, Peoples R China
[2] Beijing Univ Chem Technol, Beijing, Peoples R China
[3] Tsinghua Univ, BNRist, Dept Comp Sci & Technol, Beijing, Peoples R China
[4] Singapore Management Univ, Singapore, Singapore
[5] Informat Res Ctr Mil Sci, Beijing, Peoples R China
[6] China Mobile Res Inst, Beijing, Peoples R China
[7] Tsinghua Univ, BNRist, Dept Comp Sci & Technol, Beijing, Peoples R China
Source
PROCEEDINGS OF THE 31ST ACM INTERNATIONAL CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, CIKM 2022 | 2022
Keywords
Knowledge Graph; Entity Alignment; Self-Supervised Learning; Contrastive Learning
DOI
10.1145/3511808.3557364
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
Self-supervised entity alignment (EA) aims to link equivalent entities across different knowledge graphs (KGs) without using pre-aligned entity pairs. The current state-of-the-art (SOTA) self-supervised EA approach draws inspiration from contrastive learning, originally devised in computer vision around instance discrimination and a contrastive loss, but suffers from two shortcomings. First, it places unidirectional emphasis on pushing sampled negative entities far away rather than also pulling positively aligned pairs close, as is done in well-established supervised EA. Second, it advocates a minimum information requirement for self-supervised EA, whereas we argue that a KG's self-describing side information (e.g., entity names, relation names, entity descriptions) should be exploited to the maximum extent for the self-supervised EA task. In this work, we propose an interactive contrastive learning model for self-supervised EA. It conducts bidirectional contrastive learning by building pseudo-aligned entity pairs as pivots, enabling direct cross-KG information interaction. It further integrates entity textual and structural information through carefully designed encoders for better utilization in the self-supervised setting. Experimental results show that our approach outperforms the previous best self-supervised method by a large margin (over 9% absolute improvement in Hits@1 on average) and performs on par with previous SOTA supervised counterparts, demonstrating the effectiveness of interactive contrastive learning for self-supervised EA. The code and data are available at https://github.com/THU-KEG/ICLEA.
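The abstract's two key mechanisms, pseudo-aligned pivot pairs and bidirectional contrastive learning, can be illustrated with a short sketch. The snippet below is a minimal illustration, not the authors' released code: mine_pseudo_pairs is a hypothetical helper that builds pivots by mutual nearest-neighbour matching (one common label-free heuristic, assumed here rather than taken from the paper), and the loss is a symmetric InfoNCE-style objective that pulls each pseudo-aligned pair together in both the KG1-to-KG2 and KG2-to-KG1 directions while pushing in-batch negatives apart. All function names and the temperature value are assumptions.

import torch
import torch.nn.functional as F

def mine_pseudo_pairs(e1: torch.Tensor, e2: torch.Tensor):
    # Mutual nearest-neighbour matching: an assumed label-free heuristic
    # (not necessarily the paper's exact strategy) for building
    # pseudo-aligned pivot pairs across two KGs.
    sim = F.normalize(e1, dim=-1) @ F.normalize(e2, dim=-1).t()
    nn12 = sim.argmax(dim=1)                       # best KG2 match per KG1 entity
    nn21 = sim.argmax(dim=0)                       # best KG1 match per KG2 entity
    idx1 = torch.arange(e1.size(0))
    mutual = nn21[nn12] == idx1                    # keep pairs that agree both ways
    return idx1[mutual], nn12[mutual]

def bidirectional_contrastive_loss(z1, z2, temperature=0.1):
    # z1, z2: (batch, dim) embeddings of pseudo-aligned entities; row i of z1
    # is paired with row i of z2. Symmetric InfoNCE: each direction pulls the
    # pseudo-aligned pair together and pushes in-batch negatives apart.
    z1 = F.normalize(z1, dim=-1)                   # unit vectors -> cosine similarity
    z2 = F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / temperature             # cross-KG similarity matrix
    labels = torch.arange(z1.size(0), device=z1.device)
    loss_12 = F.cross_entropy(logits, labels)      # KG1 -> KG2 direction
    loss_21 = F.cross_entropy(logits.t(), labels)  # KG2 -> KG1 direction
    return 0.5 * (loss_12 + loss_21)

if __name__ == "__main__":
    torch.manual_seed(0)
    e1, e2 = torch.randn(100, 64), torch.randn(100, 64)
    i1, i2 = mine_pseudo_pairs(e1, e2)
    loss = bidirectional_contrastive_loss(e1[i1], e2[i2])
    print(f"{i1.numel()} pseudo pairs, loss = {loss:.4f}")

In a full implementation, z1 and z2 would come from encoders combining entity textual and structural information as the abstract describes, and the pseudo pairs would be refreshed periodically as the embeddings improve; see the authors' repository for the actual method.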
Pages: 2465-2475
Page count: 11