A Gated Dilated Convolution with Attention Model for Clinical Cloze-Style Reading Comprehension

Cited by: 11
Authors
Wang, Bin [1 ]
Zhang, Xuejie [1 ]
Zhou, Xiaobing [1 ]
Li, Junyi [1 ]
Affiliations
[1] Yunnan Univ, Sch Informat Sci & Engn, Kunming 650091, Yunnan, Peoples R China
Keywords
clinical medicine; machine reading comprehension; cloze-style; gated dilated convolution; attention mechanism
DOI
10.3390/ijerph17041323
CLC classification
X [Environmental Science, Safety Science]
Discipline classification codes
08; 0830
Abstract
Machine comprehension research in clinical medicine has great potential for practical application, but it has not received sufficient attention, and many existing models for cloze-style machine reading comprehension are very time-consuming. In this paper, we study cloze-style machine reading comprehension in the clinical medical field and propose a Gated Dilated Convolution with Attention (GDCA) model, which consists of a gated dilated convolution module and an attention mechanism. Our model has high parallelism and is capable of capturing long-distance dependencies. On the CliCR data set, our model surpasses the previous best model on several metrics, obtaining state-of-the-art results, and trains 8 times faster than that model.
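The gated dilated convolution at the core of the model can be illustrated with a minimal sketch. This is a generic GLU-style gate (a filter branch modulated by a sigmoid gate branch) over a causal dilated 1-D convolution, written in NumPy for clarity; it is not the authors' exact architecture, and the function and parameter names here are hypothetical.

```python
import numpy as np

def dilated_conv1d(x, w, dilation):
    """Causal 1-D dilated convolution.
    x: (T, C_in) input sequence; w: (K, C_in, C_out) kernel taps.
    Zero-pads on the left so the output keeps length T."""
    T, c_in = x.shape
    K, _, c_out = w.shape
    pad = (K - 1) * dilation
    xp = np.vstack([np.zeros((pad, c_in)), x])  # left padding only (causal)
    out = np.zeros((T, c_out))
    for k in range(K):
        # tap k sees the input (K-1-k)*dilation steps in the past
        out += xp[k * dilation : k * dilation + T] @ w[k]
    return out

def gated_dilated_block(x, w_filter, w_gate, dilation):
    """GLU-style gating: the filter branch is scaled elementwise
    by a sigmoid gate computed from a second dilated convolution."""
    f = dilated_conv1d(x, w_filter, dilation)
    g = 1.0 / (1.0 + np.exp(-dilated_conv1d(x, w_gate, dilation)))
    return f * g
```

Because each output position depends only on a fixed set of past positions, all positions can be computed in parallel, and stacking blocks with growing dilation rates widens the receptive field exponentially, which is the basis for the long-distance dependency claim above.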
Pages: 11