Multi-task transfer learning for biomedical machine reading comprehension

Cited by: 0
Authors
Guo, Wenyang [1 ]
Du, Yongping [1 ]
Zhao, Yiliang [1 ]
Ren, Keyan [1 ]
Affiliation
[1] Beijing Univ Technol, Fac Informat Technol, Beijing 100124, Peoples R China
Funding
National Key Research and Development Program of China;
Keywords
biomedical machine reading comprehension; multi-task learning; transfer learning; attention; data augmentation
DOI
10.1504/IJDMB.2020.107878
Chinese Library Classification
Q [Biological Sciences];
Discipline Classification Code
07; 0710; 09
Abstract
Biomedical machine reading comprehension aims to extract the answer to a given question from complex biomedical passages, which requires the machine to have a strong ability to comprehend natural language. Recent progress has been made on this task, but it is still severely restricted by insufficient training data owing to the domain-specific nature of the field. To address this problem, we propose a hierarchical question-aware context learning model trained with a multi-task transfer learning algorithm, which captures the interaction between the question and the passage layer by layer and uses multi-level embeddings to strengthen the language representation. The multi-task transfer learning algorithm leverages the complementary strengths of different machine reading comprehension tasks to improve model generalisation and robustness, pre-training on multiple large-scale open-domain data sets and fine-tuning on the target-domain training set. Moreover, data augmentation is adopted to create new training samples with varied expressions. The public biomedical data set provided by BioASQ, collected from PubMed, is used to evaluate model performance. The results show that our method outperforms the best recent solution and achieves a new state of the art.
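The abstract describes a two-stage training procedure: multi-task pre-training on several large-scale open-domain machine reading comprehension data sets, followed by fine-tuning on the smaller biomedical target set. The following is a minimal sketch of that pre-train/fine-tune loop for a span-extraction reader; the SpanReader module, data set names, and hyperparameters are illustrative assumptions, not the paper's actual hierarchical question-aware architecture.

    # Minimal sketch of multi-task pre-training followed by target-domain
    # fine-tuning for span-extraction MRC. All names and sizes are assumptions
    # for illustration; the paper's model is a hierarchical question-aware
    # context learning network, not this toy BiLSTM reader.
    import torch
    import torch.nn as nn
    from torch.utils.data import DataLoader, TensorDataset

    class SpanReader(nn.Module):
        """Toy reader: encodes concatenated (question, passage) token ids and
        predicts answer start/end positions over the sequence."""
        def __init__(self, vocab_size=30000, hidden=128):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, hidden)
            self.encoder = nn.LSTM(hidden, hidden, batch_first=True,
                                   bidirectional=True)
            self.start_head = nn.Linear(2 * hidden, 1)
            self.end_head = nn.Linear(2 * hidden, 1)

        def forward(self, token_ids):
            h, _ = self.encoder(self.embed(token_ids))
            return self.start_head(h).squeeze(-1), self.end_head(h).squeeze(-1)

    def run_epoch(model, loader, optimizer):
        loss_fn = nn.CrossEntropyLoss()
        for token_ids, starts, ends in loader:
            start_logits, end_logits = model(token_ids)
            loss = loss_fn(start_logits, starts) + loss_fn(end_logits, ends)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()

    def toy_loader(n=32, seq_len=64):
        # Random stand-in data; real inputs would be tokenised
        # question + passage pairs with gold answer spans.
        ids = torch.randint(0, 30000, (n, seq_len))
        starts = torch.randint(0, seq_len, (n,))
        ends = torch.randint(0, seq_len, (n,))
        return DataLoader(TensorDataset(ids, starts, ends), batch_size=8)

    model = SpanReader()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

    # Stage 1: multi-task pre-training on several open-domain MRC data sets.
    open_domain_loaders = {"open_domain_task_a": toy_loader(),
                           "open_domain_task_b": toy_loader()}
    for epoch in range(2):
        for name, loader in open_domain_loaders.items():
            run_epoch(model, loader, optimizer)

    # Stage 2: fine-tuning on the smaller biomedical target set (e.g. BioASQ).
    bioasq_loader = toy_loader(n=16)
    for epoch in range(3):
        run_epoch(model, bioasq_loader, optimizer)

The sketch only conveys the transfer order (open-domain tasks first, target domain last); in the paper, the shared reader is the hierarchical question-aware context learning model and the open-domain stage draws on multiple MRC tasks to improve generalisation before biomedical fine-tuning.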
Pages: 234-250
Number of pages: 17