Improving Low-Resource Chinese Event Detection with Multi-task Learning

Cited by: 0
Authors
Tong, Meihan [1 ,2 ]
Xu, Bin [1 ,2 ]
Wang, Shuai [3 ]
Hou, Lei [1 ,2 ]
Li, Juanzi [1 ,2 ]
Affiliations
[1] Beijing Natl Res Ctr Informat Sci & Technol, Tsinghua Univ, Dept Comp Sci & Technol, Beijing 100084, Peoples R China
[2] Tsinghua Univ, Inst Artificial Intelligence, Knowledge Intelligence Res Ctr, Beijing 100084, Peoples R China
[3] JOYY Inc, Dept Technol, SLP Grp, Beijing, Peoples R China
Source
KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT (KSEM 2020), PT I | 2020 / Vol. 12274
Keywords
Chinese Event Detection; Multi-task learning; Lattice LSTM;
DOI
10.1007/978-3-030-55130-8_37
CLC Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Chinese Event Detection (CED) aims to detect events from unstructured sentences. Because event detection datasets are difficult to label, previous approaches suffer from a severe data sparsity problem. To address this issue, we propose a novel Lattice LSTM based multi-task learning model. On the one hand, we exploit multi-granularity word information via Lattice LSTM to make full use of existing datasets. On the other hand, we employ a multi-task learning mechanism to improve CED with datasets from other tasks. Specifically, we combine Named Entity Recognition (NER) and Mask Word Prediction (MWP) as two auxiliary tasks to learn both entity and general language information. Experiments show that our approach outperforms six SOTA methods by 1.9% on the ACE2005 benchmark. The source code is released at https://github.com/tongmeihan1995/MLL-chinese-event-detection.
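The multi-task setup the abstract describes, a main CED objective trained jointly with NER and MWP auxiliary objectives, is conventionally implemented as a weighted sum of per-task losses over a shared encoder. The sketch below is a minimal illustration of that idea, not the paper's released code; the function name and the auxiliary-task weights `w_ner` and `w_mwp` are assumptions for illustration.

```python
def multi_task_loss(loss_ced, loss_ner, loss_mwp, w_ner=0.5, w_mwp=0.5):
    """Combine the main-task loss with weighted auxiliary-task losses.

    loss_ced: loss of the main Chinese Event Detection task
    loss_ner: loss of the Named Entity Recognition auxiliary task
    loss_mwp: loss of the Mask Word Prediction auxiliary task
    w_ner, w_mwp: hypothetical weights scaling each auxiliary task's
                  contribution to the joint training objective
    """
    return loss_ced + w_ner * loss_ner + w_mwp * loss_mwp

# Example: the main-task loss dominates; each auxiliary task
# contributes a scaled term to the joint objective.
total = multi_task_loss(1.0, 0.4, 0.2)  # 1.0 + 0.5*0.4 + 0.5*0.2 = 1.3
```

In practice all three heads would share the Lattice LSTM encoder, so gradients from the auxiliary losses also update the shared parameters, which is how the extra datasets help the low-resource main task.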
Pages: 421-433
Page count: 13
Related Papers
50 records in total
  • [41] Yu, Yongbo; Yu, Fuxun; Xu, Zirui; Wang, Di; Zhang, Mingjia; Li, Ang; Bray, Shawn; Liu, Chenchen; Chen, Xiang. Powering Multi-Task Federated Learning with Competitive GPU Resource Sharing. Companion Proceedings of the Web Conference 2022 (WWW 2022 Companion), 2022: 567-571.
  • [42] Zhang, Yu; Yang, Qiang. An overview of multi-task learning. National Science Review, 2018, 5(1): 30-43.
  • [43] Chapelle, Olivier; Shivaswamy, Pannagadatta; Vadrevu, Srinivas; Weinberger, Kilian; Zhang, Ya; Tseng, Belle. Boosted multi-task learning. Machine Learning, 2011, 85(1-2): 149-173.
  • [44] Nie, Feiping; Hu, Zhanxuan; Li, Xuelong. Calibrated Multi-Task Learning. KDD'18: Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, 2018: 2012-2021.
  • [45] Zhang, Y.; Liu, J.-W.; Zuo, X. Survey of Multi-Task Learning. Science Press, 43: 1340-1378.
  • [46] Zhang, Yu; Yang, Qiang. A Survey on Multi-Task Learning. IEEE Transactions on Knowledge and Data Engineering, 2022, 34(12): 5586-5609.
  • [47] Chu, Tianshu; Li, Xinmeng; Vo, Huy V.; Summers, Ronald M.; Sizikova, Elena. Improving Weakly Supervised Lesion Segmentation using Multi-Task Learning. Medical Imaging with Deep Learning, 2021, 143: 60-73.
  • [49] Zhang, Yu. Parallel Multi-Task Learning. 2015 IEEE International Conference on Data Mining (ICDM), 2015: 629-638.