An Empirical Study of Multi-Task Learning on BERT for Biomedical Text Mining

Times cited: 0
Authors
Peng, Yifan [1 ]
Chen, Qingyu [1 ]
Lu, Zhiyong [1 ]
Affiliations
[1] NIH, Natl Ctr Biotechnol Informat, Natl Lib Med, Bldg 10, Bethesda, MD 20892 USA
Funding
National Institutes of Health (NIH);
Keywords
CORPUS;
DOI
Not available
CLC classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
Multi-task learning (MTL) has achieved remarkable success in natural language processing applications. In this work, we study a multi-task learning model with multiple decoders on a variety of biomedical and clinical natural language processing tasks, including text similarity, relation extraction, named entity recognition, and text inference. Our empirical results demonstrate that the MTL fine-tuned models outperform state-of-the-art transformer models (e.g., BERT and its variants) by 2.0% and 1.3% in the biomedical and clinical domains, respectively. Pairwise MTL further reveals which tasks improve or degrade the performance of others, which is particularly helpful when researchers must choose a suitable model for a new problem. The code and models are publicly available at https://github.com/ncbi-nlp/bluebert.
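The setup the abstract describes is a single shared encoder fine-tuned jointly, with a small task-specific decoder per task. Below is a minimal PyTorch/Hugging Face sketch of that shared-encoder, multiple-head pattern; the task names, label counts, and the bert-base-uncased checkpoint are illustrative assumptions, not the paper's exact configuration (the authors use biomedical BERT variants; see the repository above for their actual implementation).

    import torch
    import torch.nn as nn
    from transformers import AutoModel, AutoTokenizer

    class MultiTaskBert(nn.Module):
        """Shared BERT encoder with one lightweight head ("decoder") per task.

        Sentence-level tasks (similarity, inference, relation extraction)
        classify the [CLS] representation; token-level tasks (NER) classify
        every token.
        """

        def __init__(self, encoder_name, task_num_labels, token_level_tasks=()):
            super().__init__()
            self.encoder = AutoModel.from_pretrained(encoder_name)
            hidden = self.encoder.config.hidden_size
            # One linear head per task; all heads share the same encoder.
            self.heads = nn.ModuleDict(
                {task: nn.Linear(hidden, n) for task, n in task_num_labels.items()}
            )
            self.token_level_tasks = set(token_level_tasks)

        def forward(self, task, input_ids, attention_mask):
            out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
            if task in self.token_level_tasks:
                return self.heads[task](out.last_hidden_state)   # (batch, seq, labels)
            return self.heads[task](out.last_hidden_state[:, 0])  # (batch, labels)

    # Illustrative task set and label counts (assumptions, not the paper's setup).
    model = MultiTaskBert(
        "bert-base-uncased",
        task_num_labels={"sts": 1, "re": 5, "ner": 9, "nli": 3},
        token_level_tasks=("ner",),
    )
    tok = AutoTokenizer.from_pretrained("bert-base-uncased")
    batch = tok(["aspirin inhibits cyclooxygenase"], return_tensors="pt")
    logits = model("re", batch["input_ids"], batch["attention_mask"])

In MTL fine-tuning of this kind, one typically samples a task per mini-batch and backpropagates that task's loss through the shared encoder, so every task updates the shared representation.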
Pages: 205-214
Number of pages: 10
Related Papers
50 records in total
  • [1] Biomedical Argument Mining Based on Sequential Multi-Task Learning
    Si, Jiasheng
    Sun, Liu
    Zhou, Deyu
    Ren, Jie
    Li, Lin
    IEEE-ACM TRANSACTIONS ON COMPUTATIONAL BIOLOGY AND BIOINFORMATICS, 2023, 20 (02) : 864 - 874
  • [2] BERT-Based Multi-Task Learning for Aspect-Based Opinion Mining
    Patel, Manil
    Ezeife, C. I.
    DATABASE AND EXPERT SYSTEMS APPLICATIONS, DEXA 2021, PT I, 2021, 12923 : 192 - 204
  • [3] A Multi-task Learning Framework for Product Ranking with BERT
    Wu, Xuyang
    Magnani, Alessandro
    Chaidaroon, Suthee
    Puthenputhussery, Ajit
    Liao, Ciya
    Fang, Yi
    PROCEEDINGS OF THE ACM WEB CONFERENCE 2022 (WWW'22), 2022, : 493 - 501
  • [4] Pretraining Financial Language Model with Multi-Task Learning for Financial Text Mining
    Liu, Z.
    Liu, C.
    Lin, W.
    Zhao, J.
    Jisuanji Yanjiu yu Fazhan/Computer Research and Development, 2021, 58 (08) : 1761 - 1772
  • [5] Adversarial Multi-task Learning for Text Classification
    Liu, Pengfei
    Qiu, Xipeng
    Huang, Xuanjing
    PROCEEDINGS OF THE 55TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2017), VOL 1, 2017, : 1 - 10
  • [6] Generative Multi-Task Learning for Text Classification
    Zhao, Wei
    Gao, Hui
    Chen, Shuhui
    Wang, Nan
    IEEE ACCESS, 2020, 8 : 86380 - 86387
  • [7] MTRec: Multi-Task Learning over BERT for News Recommendation
    Bi, Qiwei
    Li, Jian
    Shang, Lifeng
    Jiang, Xin
    Liu, Qun
    Yang, Hanfang
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022), 2022, : 2663 - 2669
  • [8] MTBERT-Attention: An Explainable BERT Model based on Multi-Task Learning for Cognitive Text Classification
    Sebbaq, Hanane
    El Faddouli, Nour-Eddine
    SCIENTIFIC AFRICAN, 2023, 21
  • [9] Multi-task Neural Shared Structure Search: A Study Based on Text Mining
    Li, Jiyi
    Fukumoto, Fumiyo
    DATABASE SYSTEMS FOR ADVANCED APPLICATIONS (DASFAA 2021), PT II, 2021, 12682 : 202 - 218
  • [10] QE BERT: Bilingual BERT using Multi-task Learning for Neural Quality Estimation
    Kim, Hyun
    Lim, Joon-Ho
    Kim, Hyunki
    Na, Seung-Hoon
    FOURTH CONFERENCE ON MACHINE TRANSLATION (WMT 2019), VOL 3: SHARED TASK PAPERS, DAY 2, 2019, : 85 - 89