Relation Extraction via Domain-aware Transfer Learning

Cited by: 19
|
Authors
Di, Shimin [1 ]
Shen, Yanyan [2 ]
Chen, Lei [1 ]
Affiliations
[1] Hong Kong Univ Sci & Technol, Hong Kong, Peoples R China
[2] Shanghai Jiao Tong Univ, Shanghai, Peoples R China
Source
KDD'19: PROCEEDINGS OF THE 25TH ACM SIGKDD INTERNATIONAL CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING | 2019
Funding
US National Science Foundation;
Keywords
Transfer learning; relation extraction; STABILITY;
DOI
10.1145/3292500.3330890
Chinese Library Classification (CLC)
TP [automation technology, computer technology];
Discipline code
0812;
Abstract
Relation extraction for knowledge base construction has been studied for decades due to its applicability to many problems. Most classical works, such as supervised information extraction [2] and distant supervision [23], focus on constructing the knowledge base (KB) by utilizing a large number of labels or certain related KBs. However, in many real-world scenarios, existing methods may not perform well when a new knowledge base is required but only scarce labels or few related KBs are available. In this paper, we propose a novel approach, Relation Extraction via Domain-aware Transfer Learning (ReTrans), to extract relation mentions from a given text corpus by exploiting the experience of a large number of existing KBs, which may not be closely related to the target relation. We first initialize the representations of relation mentions from the massive text corpus and update them according to existing KBs. Based on these representations, we investigate the contribution of each KB to the target task and select useful KBs to boost the effectiveness of the proposed approach. On top of the selected KBs, we develop a novel domain-aware transfer learning framework that transfers knowledge from source domains to the target domain, aiming to infer the true relation mentions in the unstructured text corpus. Most importantly, we give stability and generalization bounds for ReTrans. Experimental results on real-world datasets demonstrate the effectiveness of our approach, which outperforms all state-of-the-art baselines.
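The abstract describes a source-selection step: score each existing KB by how well its relation-mention representations match the target domain, then transfer only from the most useful ones. A minimal sketch of that idea follows; all names, the cosine scoring, and the top-k normalized weighting are illustrative assumptions, not the paper's actual method.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def mean_embedding(vectors):
    """Centroid of a list of relation-mention embeddings."""
    dim = len(vectors[0])
    return [sum(v[i] for v in vectors) / len(vectors) for i in range(dim)]

def select_source_kbs(source_kbs, target_vectors, k=2):
    """Rank source KBs by similarity of their mention-embedding centroid
    to the target domain's centroid; return (name, normalized_weight)
    for the top-k. This is a stand-in for the paper's KB-contribution
    analysis, not its actual criterion."""
    target_centroid = mean_embedding(target_vectors)
    scored = sorted(
        ((name, cosine(mean_embedding(vecs), target_centroid))
         for name, vecs in source_kbs.items()),
        key=lambda x: x[1], reverse=True)[:k]
    total = sum(s for _, s in scored) or 1.0
    return [(name, s / total) for name, s in scored]

# Toy data: kb_a aligns with the target direction, kb_b does not.
sources = {
    "kb_a": [[1.0, 0.0], [0.9, 0.1]],
    "kb_b": [[0.0, 1.0], [0.1, 0.9]],
    "kb_c": [[0.7, 0.7]],
}
target = [[1.0, 0.1]]
selected = select_source_kbs(sources, target, k=2)  # kb_a ranks first
```

In a full transfer-learning pipeline the returned weights would scale each source domain's contribution (e.g. its loss term or predictions) when training the target extractor.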
Pages: 1348 - 1357
Page count: 10
Related Papers
50 records
  • [31] REKA: Relation Extraction with Knowledge-Aware Attention
    Wang, Peiyi
    Liu, Hongtao
    Wu, Fangzhao
    Song, Jinduo
    Xu, Hongyan
    Wang, Wenjun
    KNOWLEDGE GRAPH AND SEMANTIC COMPUTING: KNOWLEDGE COMPUTING AND LANGUAGE UNDERSTANDING, 2019, 1134 : 62 - 73
  • [32] Safety reinforcement learning control via transfer learning
    Zhang, Quanqi
    Wu, Chengwei
    Tian, Haoyu
    Gao, Yabin
    Yao, Weiran
    Wu, Ligang
    AUTOMATICA, 2024, 166
  • [33] PROCESS-AWARE PREDICTION OF GEOMETRIC ACCURACY FOR ADDITIVE MANUFACTURING VIA TRANSFER LEARNING
    Lin, Daphne
    Seepersad, Carolyn
    PROCEEDINGS OF ASME 2023 INTERNATIONAL DESIGN ENGINEERING TECHNICAL CONFERENCES AND COMPUTERS AND INFORMATION IN ENGINEERING CONFERENCE, IDETC-CIE2023, VOL 3A, 2023,
  • [34] Joint entity and relation extraction with position-aware attention and relation embedding
    Chen, Tiantian
    Zhou, Lianke
    Wang, Nianbin
    Chen, Xirui
    APPLIED SOFT COMPUTING, 2022, 119
  • [35] Relation Extraction with Proactive Domain Adaptation Strategy
    Zhong, Lingfeng
    Zhu, Yi
    11TH IEEE INTERNATIONAL CONFERENCE ON KNOWLEDGE GRAPH (ICKG 2020), 2020, : 441 - 448
  • [36] A Hybrid Model with Pre-trained Entity-Aware Transformer for Relation Extraction
    Yao, Jinxin
    Zhang, Min
    Wang, Biyang
    Xu, Xianda
    KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT (KSEM 2020), PT I, 2020, 12274 : 148 - 160
  • [37] Inductive transfer learning for unlabeled target-domain via hybrid regularization
    Zhuang FuZhen
    Luo Ping
    He Qing
    Shi ZhongZhi
    CHINESE SCIENCE BULLETIN, 2009, 54 (14): : 2470 - 2478
  • [38] Inductive transfer learning for unlabeled target-domain via hybrid regularization
    Zhuang FuZhen
    Luo Ping
    He Qing
    Shi ZhongZhi
    Science Bulletin, 2009, (14) : 2471 - 2481
  • [39] REET: Joint Relation Extraction and Entity Typing via Multi-task Learning
    Liu, Hongtao
    Wang, Peiyi
    Wu, Fangzhao
    Jiao, Pengfei
    Wang, Wenjun
    Xie, Xing
    Sun, Yueheng
    NATURAL LANGUAGE PROCESSING AND CHINESE COMPUTING (NLPCC 2019), PT I, 2019, 11838 : 327 - 339
  • [40] Adversarial-Robust Transfer Learning for Medical Imaging via Domain Assimilation
    Chen, Xiaohui
    Luo, Tie
    ADVANCES IN KNOWLEDGE DISCOVERY AND DATA MINING, PT IV, PAKDD 2024, 2024, 14648 : 335 - 349