aMV-LSTM: an attention-based model with multiple positional text matching

Cited: 0
|
Authors
Belkacem, Thiziri [1 ]
Dkaki, Taoufiq [2 ]
Moreno, Jose G. [1 ]
Boughanem, Mohand [1 ]
Affiliations
[1] Paul Sabatier Univ, IRIT Lab, Toulouse, France
[2] Jean Jaures Univ, IRIT Lab, Toulouse, France
Source
SAC '19: PROCEEDINGS OF THE 34TH ACM/SIGAPP SYMPOSIUM ON APPLIED COMPUTING | 2019
Keywords
Attention models; positional; text representation; text matching;
DOI
10.1145/3297280.3297355
CLC Number
TP39 [Computer Applications];
Discipline Code
081203; 0835;
Abstract
Deep models have attracted wide interest in the recent NLP and IR state-of-the-art. Among the proposed models, position-based models and attention-based models take into account, respectively, the position of a word in the text and the importance of a word among the other words. Positional information is one of the important features that help text representation learning. However, the importance of a given word among the others in a text, which is an important aspect of text matching, is not captured by positional features. In this paper, we propose a model that combines a position-based representation learning approach with an attention-based weighting process. The latter learns an importance coefficient for each word of the input text. We propose an extension of the position-based model MV-LSTM with an attention layer, allowing a parameterizable architecture. We believe that when the model is aware of both word position and word importance, the learned representations capture more relevant features for the matching process. Our model, namely aMV-LSTM, learns attention-based coefficients to weight the words of the different input sentences before computing their position-based representations. Experimental results on question/answer matching and question-pair identification tasks show that the proposed model outperforms the MV-LSTM baseline and several state-of-the-art models.
Pages: 788-795
Page count: 8
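The weighting process the abstract describes, an attention layer that learns an importance coefficient per word and scales the word vectors before the position-based (Bi-LSTM) representation step, can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the scoring form (tanh projection plus a context vector) and all shapes and parameter names (`W`, `v`) are assumptions.

```python
import numpy as np

def attention_weights(E, W, v):
    """Score each word embedding (rows of E) and softmax-normalize.

    E: (n_words, d) word embeddings; W: (d, d); v: (d,).
    The scoring form is an assumed additive-attention variant,
    not taken from the aMV-LSTM paper itself.
    """
    scores = np.tanh(E @ W) @ v        # one scalar score per word
    scores -= scores.max()             # shift for numerical stability
    a = np.exp(scores)
    return a / a.sum()                 # coefficients sum to 1

def weight_sentence(E, W, v):
    """Scale each word vector by its learned importance coefficient,
    producing the weighted sequence that would feed the Bi-LSTM."""
    a = attention_weights(E, W, v)
    return a[:, None] * E

# toy example with random parameters (training is out of scope here)
rng = np.random.default_rng(0)
E = rng.standard_normal((5, 8))        # 5 words, 8-dim embeddings
W = rng.standard_normal((8, 8))
v = rng.standard_normal(8)
weighted = weight_sentence(E, W, v)    # same shape as E, words rescaled
```

In the full model, `W` and `v` would be trained jointly with the MV-LSTM matching layers, so the coefficients reflect each word's relevance to the matching task.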
Related Papers
(50 records)
  • [1] AB-LSTM: Attention-based Bidirectional LSTM Model for Scene Text Detection
    Liu, Zhandong
    Zhou, Wengang
    Li, Houqiang
    ACM TRANSACTIONS ON MULTIMEDIA COMPUTING COMMUNICATIONS AND APPLICATIONS, 2019, 15 (04)
  • [2] Text Summarization of Articles Using LSTM and Attention-Based LSTM
    Kumar, Harsh
    Kumar, Gaurav
    Singh, Shaivye
    Paul, Sourav
    MACHINE LEARNING AND AUTONOMOUS SYSTEMS, 2022, 269 : 133 - 145
  • [3] Attention-based LSTM, GRU and CNN for short text classification
    Yu, Shujuan
    Liu, Danlei
    Zhu, Wenfeng
    Zhang, Yun
    Zhao, Shengmei
    JOURNAL OF INTELLIGENT & FUZZY SYSTEMS, 2020, 39 (01) : 333 - 340
  • [4] Enhancements of Attention-Based Bidirectional LSTM for Hybrid Automatic Text Summarization
    Jiang, Jiawen
    Zhang, Haiyang
    Dai, Chenxu
    Zhao, Qingjuan
    Feng, Hao
    Ji, Zhanlin
    Ganchev, Ivan
    IEEE ACCESS, 2021, 9 : 123660 - 123671
  • [5] Attention-based Hierarchical LSTM Model for Document Sentiment Classification
    Wang, Bo
    Fan, Binwen
    2018 2ND INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE APPLICATIONS AND TECHNOLOGIES (AIAAT 2018), 2018, 435
  • [6] Step Counting with Attention-based LSTM
    Khan, Shehroz S.
    Abedi, Ali
    2022 IEEE SYMPOSIUM SERIES ON COMPUTATIONAL INTELLIGENCE (SSCI), 2022, : 559 - 566
  • [7] AFMtrack: Attention-Based Feature Matching for Multiple Object Tracking
    Cuong Bui, Duy
    Anh Hoang, Hiep
    Yoo, Myungsik
    IEEE ACCESS, 2024, 12 : 82897 - 82910
  • [8] Attention-Based Bi-LSTM Model for Arabic Depression Classification
    Almars, Abdulqader M.
    CMC-COMPUTERS MATERIALS & CONTINUA, 2022, 71 (02): : 3091 - 3106
  • [9] Intrusion Detection Using Attention-Based CNN-LSTM Model
    Al-Omar, Ban
    Trabelsi, Zouheir
    ARTIFICIAL INTELLIGENCE APPLICATIONS AND INNOVATIONS, AIAI 2023, PT I, 2023, 675 : 515 - 526
  • [10] An Improved Attention-based Bidirectional LSTM Model for Cyanobacterial Bloom Prediction
    Ni, Jianjun
    Liu, Ruping
    Tang, Guangyi
    Xie, Yingjuan
    International Journal of Control, Automation and Systems, 2022, 20 : 3445 - 3455