Relation Extraction via Domain-aware Transfer Learning

Cited by: 19
Authors
Di, Shimin [1 ]
Shen, Yanyan [2 ]
Chen, Lei [1 ]
Affiliations
[1] Hong Kong Univ Sci & Technol, Hong Kong, Peoples R China
[2] Shanghai Jiao Tong Univ, Shanghai, Peoples R China
Source
KDD'19: PROCEEDINGS OF THE 25TH ACM SIGKDD INTERNATIONAL CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING | 2019
Funding
U.S. National Science Foundation
Keywords
Transfer learning; relation extraction; stability
DOI
10.1145/3292500.3330890
Chinese Library Classification
TP [automation technology; computer technology]
Subject Classification Code
0812
Abstract
Relation extraction for knowledge base construction has been studied for decades due to its applicability to many problems. Most classical works, such as supervised information extraction [2] and distant supervision [23], focus on constructing the knowledge base (KB) by utilizing a large number of labels or certain closely related KBs. However, in many real-world scenarios, existing methods may not perform well when a new knowledge base is required but only scarce labels or few related KBs are available. In this paper, we propose a novel approach, Relation Extraction via Domain-aware Transfer Learning (ReTrans), to extract relation mentions from a given text corpus by exploiting the experience of a large number of existing KBs, which may not be closely related to the target relation. We first propose to initialize the representations of relation mentions from the massive text corpus and update those representations according to existing KBs. Based on the representations of relation mentions, we investigate the contribution of each KB to the target task and propose to select useful KBs to boost the effectiveness of the proposed approach. Based on the selected KBs, we develop a novel domain-aware transfer learning framework to transfer knowledge from the source domains to the target domain, aiming to infer the true relation mentions in the unstructured text corpus. Most importantly, we derive the stability and generalization bound of ReTrans. Experimental results on real-world datasets demonstrate the effectiveness of our approach, which outperforms all state-of-the-art baselines.
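The abstract outlines three steps: learning representations of relation mentions, scoring each source KB's contribution to the target task so that only useful KBs are selected, and transferring knowledge from the selected source domains to the target domain. The sketch below is a minimal, hypothetical Python illustration of the KB-selection and weighted-transfer idea only; it is not the authors' ReTrans implementation, and all function names, the synthetic data, and the simple weighted logistic-regression classifier are illustrative assumptions.

```python
# Illustrative sketch (not the ReTrans code): score candidate source KBs by
# similarity of their relation-mention embeddings to the target domain, keep
# the most relevant ones, and down-weight their examples when training a
# simple relation classifier on the pooled data.
import numpy as np

def kb_similarity(source_emb: np.ndarray, target_emb: np.ndarray) -> float:
    """Cosine similarity between the mean relation-mention embeddings."""
    s, t = source_emb.mean(axis=0), target_emb.mean(axis=0)
    return float(s @ t / (np.linalg.norm(s) * np.linalg.norm(t) + 1e-12))

def select_source_kbs(source_embs: dict, target_emb: np.ndarray, k: int = 2):
    """Rank candidate source KBs by similarity to the target and keep the top-k."""
    scores = [(name, kb_similarity(emb, target_emb)) for name, emb in source_embs.items()]
    scores.sort(key=lambda x: x[1], reverse=True)
    return scores[:k]

def fit_weighted_classifier(X, y, weights, lr=0.1, epochs=200):
    """Weighted binary logistic regression trained by plain gradient descent."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
        g = weights * (p - y)               # per-example weighted gradient
        w -= lr * (X.T @ g) / len(y)
        b -= lr * g.mean()
    return w, b

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Scarce target-domain relation-mention embeddings (synthetic).
    target_emb = rng.normal(0.5, 1.0, size=(50, 16))
    source_embs = {
        "KB_A": rng.normal(0.6, 1.0, size=(200, 16)),   # related source KB
        "KB_B": rng.normal(-2.0, 1.0, size=(200, 16)),  # unrelated source KB
    }
    selected = select_source_kbs(source_embs, target_emb, k=1)
    print("selected source KBs:", selected)

    # Pool the target data with the selected sources, weighting source examples
    # by how similar their KB is to the target domain.
    X_parts, y_parts, w_parts = [target_emb], [rng.integers(0, 2, 50)], [np.ones(50)]
    for name, score in selected:
        emb = source_embs[name]
        X_parts.append(emb)
        y_parts.append(rng.integers(0, 2, len(emb)))
        w_parts.append(np.full(len(emb), max(score, 0.0)))
    X = np.vstack(X_parts)
    y = np.concatenate(y_parts).astype(float)
    wts = np.concatenate(w_parts)
    w, b = fit_weighted_classifier(X, y, wts)
    print("learned weight norm:", np.linalg.norm(w).round(3))
```

In this toy setup the unrelated source KB receives a low similarity score and is filtered out, mimicking the paper's idea of selecting only the KBs that contribute to the target task before transferring.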
Pages: 1348-1357 (10 pages)
Related papers
(50 total)
  • [21] Interactive optimization of relation extraction via knowledge graph representation learning
    Liu Y.
    Ma Y.
    Zhang Y.
    Yu R.
    Zhang Z.
    Meng Y.
    Zhou Z.
    Journal of Visualization, 2024, 27 (2) : 197 - 213
  • [22] DA-Parser: A Pre-trained Domain-aware Parsing Framework for Heterogeneous Log Analysis
    Tao, Shimin
    Liu, Yilun
    Meng, Weibin
    Wang, Jingyu
    Zhao, Yanqing
    Su, Chang
    Tian, Weinan
    Zhang, Min
    Yang, Hao
    Chen, Xun
    2023 IEEE 47TH ANNUAL COMPUTERS, SOFTWARE, AND APPLICATIONS CONFERENCE, COMPSAC, 2023, : 322 - 327
  • [23] Weakly Supervised Domain Adaptation for Aspect Extraction via Multilevel Interaction Transfer
    Liang, Tao
    Wang, Wenya
    Lv, Fengmao
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022, 33 (10) : 5818 - 5829
  • [24] Transfer Learning via Learning to Transfer
    Wei, Ying
    Zhang, Yu
    Huang, Junzhou
    Yang, Qiang
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 80, 2018, 80
  • [25] Cross-lingual transfer learning for relation extraction using Universal Dependencies
    Taghizadeh, Nasrin
    Faili, Heshaam
    COMPUTER SPEECH AND LANGUAGE, 2022, 71
  • [26] Transfer learning of coverage functions via invariant properties in the Fourier domain
    Kuo-Shih Tseng
    Autonomous Robots, 2021, 45 : 519 - 542
  • [27] Graph Transfer Learning via Adversarial Domain Adaptation With Graph Convolution
    Dai, Quanyu
    Wu, Xiao-Ming
    Xiao, Jiaren
    Shen, Xiao
    Wang, Dan
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2023, 35 (05) : 4908 - 4922
  • [28] Manifold Criterion Guided Transfer Learning via Intermediate Domain Generation
    Zhang, Lei
    Wang, Shanshan
    Huang, Guang-Bin
    Zuo, Wangmeng
    Yang, Jian
    Zhang, David
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2019, 30 (12) : 3759 - 3773
  • [30] Auxiliary Learning for Relation Extraction
    Lyu, Shengfei
    Cheng, Jin
    Wu, Xingyu
    Cui, Lizhen
    Chen, Huanhuan
    Miao, Chunyan
    IEEE TRANSACTIONS ON EMERGING TOPICS IN COMPUTATIONAL INTELLIGENCE, 2022, 6 (01): : 182 - 191