Machine Reading Comprehension Model Based on Fusion of Mixed Attention

Cited by: 0
Authors
Wang, Yanfeng [1 ,2 ]
Ma, Ning [1 ,2 ]
Guo, Zechen [1 ,2 ]
Affiliations
[1] Minist Educ, Key Lab Language Cultural Comp, Lanzhou 730030, Peoples R China
[2] Northwest Minzu Univ, Key Lab Chinas Ethn Languages & Intelligent Proc G, Lanzhou 730030, Peoples R China
Source
APPLIED SCIENCES-BASEL | 2024, Vol. 14, Issue 17
Funding
National Natural Science Foundation of China;
Keywords
machine reading comprehension; hybrid attention mechanism; DuReader2; BERT;
DOI
10.3390/app14177794
Chinese Library Classification
O6 [Chemistry];
Discipline Code
0703 ;
Abstract
To address the insufficient semantic fusion between text and questions, and the lack of consideration of global semantic information, in existing machine reading comprehension models, we propose BERT_hybrid, a machine reading comprehension model based on BERT and a hybrid attention mechanism. In this model, BERT separately maps the text and the questions into the feature space. By integrating Bi-LSTM, an attention mechanism, and a self-attention mechanism, the proposed model achieves comprehensive semantic fusion between text and questions. The probability distribution over answers is computed using Softmax. Experimental results on the public DuReader dataset demonstrate that the proposed model improves BLEU-4 and ROUGE-L scores over existing models. Furthermore, to validate the effectiveness of the model design, we analyze the factors influencing the model's performance.
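The pipeline the abstract describes (encoder features for passage and question, cross-attention fusion, self-attention for global context, then Softmax over answer positions) can be sketched with plain numpy. This is a minimal illustration, not the paper's implementation: random matrices stand in for BERT/Bi-LSTM outputs, the fusion is a simple residual dot-product attention, and the sizes (`d`, sequence lengths) are arbitrary assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
d = 8                           # hidden size (BERT would give 768)
P = rng.normal(size=(20, d))    # passage token features (stand-in for encoder output)
Q = rng.normal(size=(6, d))     # question token features

# Passage-to-question attention: fuse question semantics into each passage token
A = softmax(P @ Q.T, axis=-1)   # (20, 6) alignment weights
P_fused = P + A @ Q             # residual fusion of attended question context

# Self-attention over the fused passage captures global semantic information
S = softmax(P_fused @ P_fused.T / np.sqrt(d), axis=-1)
H = S @ P_fused                 # (20, d) globally contextualized representation

# Linear heads + Softmax yield start/end answer-position distributions
w_start, w_end = rng.normal(size=(d,)), rng.normal(size=(d,))
p_start = softmax(H @ w_start)  # (20,) probability per passage position
p_end = softmax(H @ w_end)
```

Each of `p_start` and `p_end` sums to 1 over the passage positions; a span-extraction reader would pick the highest-probability valid (start, end) pair.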
Pages: 17
References
17 in total
[1]   Context-aware knowledge selection and reliable model recommendation with ACCORDION [J].
Ahmed, Yasmine ;
Telmer, Cheryl A. ;
Zhou, Gaoxiang ;
Miskov-Zivanov, Natasa .
FRONTIERS IN SYSTEMS BIOLOGY, 2024, 4
[2]  
[Anonymous], 2017, ICLR
[3]   Chinese EMR Named Entity Recognition Using Fused Label Relations Based on Machine Reading Comprehension Framework [J].
Duan, Junwen ;
Liu, Shuyue ;
Liao, Xincheng ;
Gong, Feng ;
Yue, Hailin ;
Wang, Jianxin .
IEEE-ACM TRANSACTIONS ON COMPUTATIONAL BIOLOGY AND BIOINFORMATICS, 2024, 21 (05) :1143-1153
[4]   A T5-based interpretable reading comprehension model with more accurate evidence training [J].
Guan, Boxu ;
Zhu, Xinhua ;
Yuan, Shangbo .
INFORMATION PROCESSING & MANAGEMENT, 2024, 61 (02)
[5]   Application of machine reading comprehension techniques for named entity recognition in materials science [J].
Huang, Zihui ;
He, Liqiang ;
Yang, Yuhang ;
Li, Andi ;
Zhang, Zhiwen ;
Wu, Siwei ;
Wang, Yang ;
He, Yan ;
Liu, Xujie .
JOURNAL OF CHEMINFORMATICS, 2024, 16 (01)
[6]  
Kakulapati V., 2023, J. Sci. Res. Rep, V29, P20, DOI [10.9734/jsrr/2023/v29i121814, DOI 10.9734/JSRR/2023/V29I121814]
[7]  
Lample G, 2019, Arxiv, DOI arXiv:1901.07291
[8]   Event extraction as machine reading comprehension with question-context bridging [J].
Liu, Liu ;
Liu, Ming ;
Liu, Shanshan ;
Ding, Kun .
KNOWLEDGE-BASED SYSTEMS, 2024, 299
[9]   JEEMRC: Joint Event Detection and Extraction via an End-to-End Machine Reading Comprehension Model [J].
Liu, Shanshan ;
Zhang, Sheng ;
Ding, Kun ;
Liu, Liu .
ELECTRONICS, 2024, 13 (10)
[10]   Efficient Machine Reading Comprehension for Health Care Applications: Algorithm Development and Validation of a Context Extraction Approach [J].
Nguyen, Duy-Anh ;
Lambert, Gavin ;
Kowalczyk, Ryszard ;
McDonald, Rachael ;
Vo, Quoc Bao .
JMIR FORMATIVE RESEARCH, 2024, 8