Multi-task transfer learning for biomedical machine reading comprehension

Cited by: 0
Authors
Guo, Wenyang [1 ]
Du, Yongping [1 ]
Zhao, Yiliang [1 ]
Ren, Keyan [1 ]
Affiliations
[1] Beijing Univ Technol, Fac Informat Technol, Beijing 100124, Peoples R China
Funding
National Key R&D Program of China;
Keywords
biomedical machine reading comprehension; multi-task learning; transfer learning; attention; data augmentation;
DOI
10.1504/IJDMB.2020.107878
CLC number
Q [Biological Sciences];
Discipline codes
07 ; 0710 ; 09 ;
Abstract
Biomedical machine reading comprehension aims to extract the answer to a given question from complex biomedical passages, which requires strong natural language comprehension from the machine. Recent progress has been made on this task, but it remains severely restricted by insufficient training data owing to the domain-specific nature of the problem. To address this, we propose a hierarchical question-aware context learning model trained with a multi-task transfer learning algorithm, which captures the interaction between the question and the passage layer by layer and uses multi-level embeddings to strengthen the language representation. The multi-task transfer learning algorithm leverages the advantages of different machine reading comprehension tasks to improve model generalisation and robustness, pre-training on multiple large-scale open-domain data sets and then fine-tuning on the target-domain training set. Moreover, data augmentation is adopted to create new training samples with varied expressions. The public biomedical data set collected from PubMed and provided by BioASQ is used to evaluate model performance. The results show that our method outperforms the best recent solution and achieves a new state of the art.
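The pre-train-then-fine-tune schedule described in the abstract can be sketched as follows. This is a minimal illustration only: the toy one-parameter linear model, the dataset names (`squad_like`, `newsqa_like`), the sampling scheme, and the learning rates are assumptions for demonstration, not the paper's actual architecture or data.

```python
# Sketch of multi-task transfer learning: pre-train shared parameters on
# several open-domain source tasks, then fine-tune on a small target-domain set.
import random

def sgd_step(w, batch, lr=0.05):
    """One least-squares gradient step on (x, y) pairs for the model y ≈ w * x."""
    grad = sum(2 * (w * x - y) * x for x, y in batch) / len(batch)
    return w - lr * grad

def multi_task_pretrain(w, source_tasks, epochs=50, seed=0):
    """Mix batches across all source tasks so the shared weight sees every task."""
    rng = random.Random(seed)
    for _ in range(epochs):
        task = rng.choice(sorted(source_tasks))      # task-level sampling
        batch = rng.sample(source_tasks[task], k=4)  # one batch from that task
        w = sgd_step(w, batch)
    return w

def fine_tune(w, target_data, epochs=30):
    """Adapt the pre-trained weight on the small target-domain training set."""
    for _ in range(epochs):
        w = sgd_step(w, target_data)
    return w

# Synthetic stand-ins: the open-domain tasks share the mapping y = 2x,
# while the "biomedical" target follows the related but shifted y = 2.5x.
open_domain = {
    "squad_like":  [(x, 2.0 * x) for x in range(1, 5)],
    "newsqa_like": [(x, 2.0 * x + 0.1) for x in range(1, 5)],
}
biomedical = [(x, 2.5 * x) for x in range(1, 5)]

w_pre = multi_task_pretrain(0.0, open_domain)  # settles near the shared source solution
w_final = fine_tune(w_pre, biomedical)         # then adapts toward the target domain
```

The point of the sketch is the schedule, not the model: the same parameters are first shaped by mixed source-task batches, then specialised on the scarce target data, mirroring the pre-training/fine-tuning split the abstract describes.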
Pages: 234-250 (17 pages)
Related papers
(50 total)
  • [1] A Multi-Task Learning Machine Reading Comprehension Model for Noisy Document
    Wu, Zhijing
    Xu, Hua
    THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 13963 - 13964
  • [2] Multi-task Learning with Sample Re-weighting for Machine Reading Comprehension
    Xu, Yichong
    Liu, Xiaodong
    Shen, Yelong
    Liu, Jingjing
    Gao, Jianfeng
    2019 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL HLT 2019), VOL. 1, 2019, : 2644 - 2655
  • [3] Improving Machine Reading Comprehension with Multi-Task Learning and Self-Training
    Ouyang, Jianquan
    Fu, Mengen
    MATHEMATICS, 2022, 10 (03)
  • [4] Multi-Task Learning with Generative Adversarial Training for Multi-Passage Machine Reading Comprehension
    Ren, Qiyu
    Cheng, Xiang
    Su, Sen
    THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 8705 - 8712
  • [5] Multi-Passage Machine Reading Comprehension Through Multi-Task Learning and Dual Verification
    Li, Xingyi
    Cheng, Xiang
    Xia, Min
    Ren, Qiyu
    He, Zhaofeng
    Su, Sen
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2024, 36 (10) : 5280 - 5293
  • [6] Multi-task joint training model for machine reading comprehension
    Li, Fangfang
    Shan, Youran
    Mao, Xingliang
    Ren, Xingkai
    Liu, Xiyao
    Zhang, Shichao
    NEUROCOMPUTING, 2022, 488 : 66 - 77
  • [7] Named Entity Recognition via Machine Reading Comprehension: A Multi-Task Learning Approach
    Wang, Yibo
    Zhao, Wenting
    Wan, Yao
    Deng, Zhongfen
    Yu, Philip S.
    13TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING AND THE 3RD CONFERENCE OF THE ASIA-PACIFIC CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, IJCNLP-AACL 2023, 2023, : 13 - 19
  • [8] Multi-Task Learning of Japanese How-to Tip Machine Reading Comprehension by a Generative Model
    Wang, Xiaotian
    Li, Tingxuan
    Tamura, Takuya
    Nishida, Shunsuke
    Utsuro, Takehito
    IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS, 2024, E107D (01) : 125 - 134
  • [9] Incorporating Relation Knowledge into Commonsense Reading Comprehension with Multi-task Learning
    Xia, Jiangnan
    Wu, Chen
    Yan, Ming
    PROCEEDINGS OF THE 28TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT (CIKM '19), 2019, : 2393 - 2396
  • [10] Japanese How-to Tip Machine Reading Comprehension by Multi-task Learning Based on Generative Model
    Wang, Xiaotian
    Li, Tingxuan
    Tamura, Takuya
    Nishida, Shunsuke
    Zhu, Fuzhu
    Utsuro, Takehito
    Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2023, 14102 LNAI : 3 - 14