Deep Hierarchical Attention Networks for Text Matching in Information Retrieval

Cited by: 0
Authors
Song, Meina [1 ]
Liu, Qing [1 ]
Haihong, E. [1 ]
Affiliations
[1] Beijing Univ Posts & Telecommun, Sch Comp Sci, Beijing 100876, Peoples R China
Source
PROCEEDINGS OF 2018 INTERNATIONAL CONFERENCE ON INFORMATION SYSTEMS AND COMPUTER AIDED EDUCATION (ICISCAE 2018) | 2018
Keywords
Text matching; information retrieval; deep learning; hierarchical attention; deep neural network
DOI
Not available
Chinese Library Classification (CLC)
TP [Automation and computer technology]
Discipline code
0812
Abstract
Text matching is the task of examining two pieces of text, such as a query and a document, and determining whether they have the same meaning. Text matching is central to many NLP tasks, such as document retrieval, question answering, automatic conversation, and machine translation. In recent years, several representation-based and interaction-based neural networks have been proposed and have achieved improvements. However, the powerful attention mechanism is rarely used in these models. Inspired by the success of attention in machine translation and document classification, in this paper we propose Deep Hierarchical Attention Networks for Text Matching, namely Deep-HAN-Matching. Specifically, Deep-HAN-Matching uses recurrent neural networks and attention mechanisms to extract meaningful matching patterns and rich contextual features hierarchically, from words up to the whole document, at the query-term level, and finally produces a matching score with a fully connected neural network. Experimental results on WikiQA, a popular benchmark dataset for answer sentence selection in question answering, show that our model significantly outperforms traditional retrieval baselines and several recent deep-neural-network-based matching models.
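The record contains no code, so the following is only a minimal PyTorch sketch of the general architecture the abstract describes: word-level and sentence-level Bi-GRU encoders with attention pooling, followed by a fully connected scorer. All names and design details here (HierAttnMatcher, AttentionPool, the hidden sizes, the [q; d; q*d; |q-d|] feature combination) are illustrative assumptions, not the authors' actual Deep-HAN-Matching implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionPool(nn.Module):
    # Additive attention pooling, as in Hierarchical Attention Networks:
    # scores each position of a sequence and returns the weighted sum.
    def __init__(self, hidden_dim):
        super().__init__()
        self.proj = nn.Linear(hidden_dim, hidden_dim)
        self.context = nn.Linear(hidden_dim, 1, bias=False)

    def forward(self, h):                          # h: (batch, seq_len, hidden)
        u = torch.tanh(self.proj(h))               # position-wise projection
        alpha = F.softmax(self.context(u), dim=1)  # attention weights over seq
        return (alpha * h).sum(dim=1)              # (batch, hidden)

class HierAttnMatcher(nn.Module):
    # Word-level Bi-GRU + attention builds sentence vectors; a sentence-level
    # Bi-GRU + attention builds a document vector; a small fully connected
    # network scores the (query, document) pair. The feature combination
    # [q; d; q*d; |q-d|] is an assumption, not taken from the paper.
    def __init__(self, vocab_size, emb_dim=100, hidden=64):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.word_rnn = nn.GRU(emb_dim, hidden, batch_first=True, bidirectional=True)
        self.word_attn = AttentionPool(2 * hidden)
        self.sent_rnn = nn.GRU(2 * hidden, hidden, batch_first=True, bidirectional=True)
        self.sent_attn = AttentionPool(2 * hidden)
        self.scorer = nn.Sequential(
            nn.Linear(8 * hidden, hidden), nn.ReLU(), nn.Linear(hidden, 1))

    def encode(self, x):                           # x: (batch, n_sents, n_words)
        b, s, w = x.shape
        h, _ = self.word_rnn(self.emb(x.view(b * s, w)))  # word-level context
        sents = self.word_attn(h).view(b, s, -1)          # sentence vectors
        h, _ = self.sent_rnn(sents)                       # sentence-level context
        return self.sent_attn(h)                          # document vector

    def forward(self, query, doc):
        q, d = self.encode(query), self.encode(doc)
        feats = torch.cat([q, d, q * d, (q - d).abs()], dim=-1)
        return self.scorer(feats).squeeze(-1)             # matching score

# Toy usage: 2 queries (one "sentence" of 8 words) against 2 documents.
model = HierAttnMatcher(vocab_size=1000)
query = torch.randint(1, 1000, (2, 1, 8))
doc = torch.randint(1, 1000, (2, 4, 12))
print(model(query, doc).shape)                     # torch.Size([2])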
Pages: 476-481
Page count: 6
Related papers
50 records in total
  • [31] Text mining and information retrieval
    Forest, Dominic
    Da Sylva, Lyne
    CANADIAN JOURNAL OF INFORMATION AND LIBRARY SCIENCE-REVUE CANADIENNE DES SCIENCES DE L INFORMATION ET DE BIBLIOTHECONOMIE, 2011, 35 (03): : 217 - 227
  • [32] HIERARCHICAL STORAGE IN INFORMATION RETRIEVAL
    SALASIN, J
    COMMUNICATIONS OF THE ACM, 1973, 16 (05) : 291 - 295
  • [33] On hierarchical multimedia information retrieval
    Jane, Y
    Dillon, T
    Liu, J
    Pissaloux, E
    2001 INTERNATIONAL CONFERENCE ON IMAGE PROCESSING, VOL II, PROCEEDINGS, 2001, : 729 - 732
  • [34] Considerations in Evaluation of Deep Hashing Networks for Information Retrieval System
    Kim, Subin
    Choi, Yunseon
    Lee, Byunghan
    2023 20TH INTERNATIONAL SOC DESIGN CONFERENCE, ISOCC, 2023, : 149 - 150
  • [35] DEM: Deep Entity Matching Across Heterogeneous Information Networks
    Kong, Chao
    Chen, Bao-Xiang
    Zhang, Li-Ping
    JOURNAL OF COMPUTER SCIENCE AND TECHNOLOGY, 2020, 35 (04) : 739 - 750
  • [36] Hierarchical attention networks for information extraction from cancer pathology reports
    Gao, Shang
    Young, Michael T.
    Qiu, John X.
    Yoon, Hong-Jun
    Christian, James B.
    Fearn, Paul A.
    Tourassi, Georgia D.
Ramanathan, Arvind
    JOURNAL OF THE AMERICAN MEDICAL INFORMATICS ASSOCIATION, 2018, 25 (03) : 321 - 330
  • [38] Summarization of Text and Image Captioning in Information Retrieval Using Deep Learning Techniques
    Mahalakshmi, P.
    Fatima, N. Sabiyath
    IEEE ACCESS, 2022, 10 : 18289 - 18297
  • [39] Point to Rectangle Matching for Image Text Retrieval
    Wang, Zheng
    Gao, Zhenwei
    Xu, Xing
    Luo, Yadan
    Yang, Yang
    Shen, Heng Tao
    PROCEEDINGS OF THE 30TH ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, MM 2022, 2022, : 4977 - 4986
  • [40] IMRAM: Iterative Matching with Recurrent Attention Memory for Cross-Modal Image-Text Retrieval
    Chen, Hui
    Ding, Guiguang
    Liu, Xudong
    Lin, Zijia
    Liu, Ji
    Han, Jungong
    2020 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2020), 2020, : 12652 - 12660