Long text semantic matching model based on BERT and dense composite network

Cited: 0
Authors
Chen Y.-L. [1 ]
Gao Z.-C. [1 ]
Cai X.-D. [2 ]
Affiliations
[1] School of Mechanical and Electrical Engineering, Guilin University of Electronic Technology, Guilin
[2] School of Information and Communication, Guilin University of Electronic Technology, Guilin
Source
Jilin Daxue Xuebao (Gongxueban)/Journal of Jilin University (Engineering and Technology Edition) | 2024, Vol. 54, No. 1
Keywords
BERT; Bi-LSTM; deep learning; dense composite network; long text semantic matching; TextCNN;
DOI
10.13229/j.cnki.jdxbgxb.20220239
Abstract
In semantic matching of long texts, it is difficult to capture contextual dependencies and topic information, which often leads to poor matching performance. This paper proposes a long text semantic matching method based on BERT and a dense composite network; by densely connecting the BERT embedding with the composite network, the accuracy of long text semantic matching is significantly improved. First, the sentence pair is fed into the BERT pre-training model, and accurate word vector representations are obtained through iterative feedback, yielding high-quality semantic information for the sentence pair. Second, a dense composite network is designed: Bi-LSTM first captures the global semantic information of the sentence pair, then TextCNN extracts and integrates local semantic information to obtain the key features of each sentence and the correspondence between the sentences, and the BERT embedding is fused with the hidden output of Bi-LSTM and the pooled output of TextCNN. Finally, aggregating the association states between the networks during training effectively prevents network degradation and strengthens the model's discrimination ability. Experimental results show that on community question answering (CQA) long text datasets the proposed method is effective, with an average improvement of 45%. © 2024 Editorial Board of Jilin University. All rights reserved.
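To make the described architecture concrete, the following is a minimal sketch of one way such a BERT + Bi-LSTM + TextCNN composite could be assembled, assuming a PyTorch/Transformers setup; the layer sizes, the concatenation-based fusion, and all names (e.g. DenseCompositeMatcher) are illustrative assumptions, not the authors' released implementation.

```python
# Minimal sketch of a dense composite matcher, assuming PyTorch + Hugging Face
# Transformers. Sizes, fusion strategy, and names are illustrative assumptions.
import torch
import torch.nn as nn
from transformers import BertModel


class DenseCompositeMatcher(nn.Module):
    def __init__(self, bert_name="bert-base-chinese", hidden=256,
                 n_filters=128, num_labels=2):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)       # sentence-pair encoder
        d = self.bert.config.hidden_size                       # 768 for BERT-base
        self.bilstm = nn.LSTM(d, hidden, batch_first=True,
                              bidirectional=True)              # global semantics
        self.convs = nn.ModuleList([                           # TextCNN: local n-gram features
            nn.Conv1d(2 * hidden, n_filters, k) for k in (2, 3, 4)
        ])
        # Fusion of BERT pooled output, Bi-LSTM states, and TextCNN features
        self.classifier = nn.Linear(d + 2 * hidden + 3 * n_filters, num_labels)

    def forward(self, input_ids, attention_mask, token_type_ids):
        out = self.bert(input_ids=input_ids,
                        attention_mask=attention_mask,
                        token_type_ids=token_type_ids)
        seq, pooled = out.last_hidden_state, out.pooler_output  # (B, L, d), (B, d)
        lstm_out, _ = self.bilstm(seq)                          # (B, L, 2*hidden)
        lstm_feat = lstm_out.max(dim=1).values                  # global feature
        conv_in = lstm_out.transpose(1, 2)                      # (B, 2*hidden, L)
        cnn_feat = torch.cat(
            [torch.relu(c(conv_in)).max(dim=2).values for c in self.convs], dim=1
        )                                                       # pooled local features
        fused = torch.cat([pooled, lstm_feat, cnn_feat], dim=1)  # composite fusion
        return self.classifier(fused)                           # match / no-match logits
```

In this sketch the three representations are simply concatenated before a linear classifier; the dense connections the abstract describes for summarizing association states during training may combine these outputs differently.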
Pages: 232-239
Number of pages: 7
References
21 in total
  • [1] Filice Simone, Da San Martino Giovanni, Moschitti Alessandro, et al., SemEval-2017 task 3: learning pairwise patterns in community question answering, Proceedings of the 11th International Workshop on Semantic Evaluation, pp. 326-333, (2017)
  • [2] Wu Guo-shun, Sheng Yi-xuan, Lan Man, et al., Using traditional and deep learning methods to address community question answering task, Proceedings of the 11th International Workshop on Semantic Evaluation, pp. 365-369, (2017)
  • [3] Feng Wen-zheng, Wu Yu, Wu Wei, et al., Ranking system with neural matching features for community question answering, Proceedings of the 11th International Workshop on Semantic Evaluation, pp. 280-286, (2017)
  • [4] Koreeda Yuta, Hashito Takuya, Niwa Yoshiki, et al., Combination of neural similarity features and comment plausibility features, Proceedings of the 11th International Workshop on Semantic Evaluation, pp. 353-359, (2017)
  • [5] Wang Zhi-guo, Hamza Wael, Florian Radu, Bilateral multi-perspective matching for natural language sentences, Proceedings of the 26th International Joint Conference on Artificial Intelligence, pp. 4144-4150, (2017)
  • [6] Tan Chuan-qi, Wei Fu-ru, Wang Wen-hui, et al., Multiway attention networks for modeling sentence pairs, Proceedings of the 27th International Joint Conference on Artificial Intelligence, pp. 4411-4417, (2018)
  • [7] Deriu Jan Milan, Cieliebak Mark, Attention-based convolutional neural network for community question answering, Proceedings of the 11th International Workshop on Semantic Evaluation, pp. 334-338, (2017)
  • [8] Devlin Jacob, Chang Ming-wei, Lee Kenton, et al., BERT: pre-training of deep bidirectional transformers for language understanding, Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics, pp. 4171-4186, (2019)
  • [9] Chen Yuan, Qiu Xin-ying, Multi-task semantic matching with self-supervised learning, Acta Scientiarum Naturalium Universitatis Pekinensis, 58, 1, pp. 83-90, (2022)
  • [10] Reimers Nils, Gurevych Iryna, Sentence-BERT: sentence embeddings using Siamese BERT-networks, Proceedings of the 3rd Workshop on Neural Generation and Translation, pp. 3982-3992, (2019)