LiMe: Linear Methods for Pseudo-Relevance Feedback

Cited by: 9
Authors
Valcarce, Daniel [1 ]
Parapar, Javier [1 ]
Barreiro, Alvaro [1 ]
Affiliations
[1] Univ A Coruna, Dept Comp Sci, La Coruna, Spain
Source
33RD ANNUAL ACM SYMPOSIUM ON APPLIED COMPUTING | 2018
Keywords
Linear methods; pseudo-relevance feedback; query expansion; linear least squares; SELECTION; MODELS; REGRESSION;
DOI
10.1145/3167132.3167207
CLC Number
TP301 [Theory, Methods];
Discipline Code
081202;
Abstract
Retrieval effectiveness has traditionally been pursued by improving the ranking models and by enriching the evidence about the information need beyond the original query. A successful approach for producing improved rankings consists of expanding the original query. Pseudo-relevance feedback (PRF) has proved to be an effective method for this task in the absence of explicit user judgements about the initial ranking. This family of techniques obtains expansion terms from the top documents retrieved by the original query. PRF techniques usually exploit the relationships between terms and documents or between terms and queries. In this paper, we explore the use of linear methods for pseudo-relevance feedback. We present a novel formulation of the PRF task as a matrix decomposition problem, which we call LiMe. This factorisation involves the computation of an inter-term similarity matrix, which is used for expanding the original query. We use linear least squares regression with regularisation and non-negativity constraints to solve the proposed decomposition. We compare LiMe against strong state-of-the-art PRF baselines on five datasets, showing that our proposal achieves improvements in terms of MAP, nDCG and the robustness index.
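As a rough illustration of the decomposition described in the abstract, the snippet below is a minimal Python sketch, not the authors' implementation. It assumes that the query vector and the top pseudo-relevant documents are stacked into a term-weight matrix X, that X is approximated by X @ W with a non-negative, zero-diagonal inter-term similarity matrix W obtained column by column via L2-regularised non-negative least squares, and that q @ W gives candidate expansion weights. The helper name inter_term_similarity and the toy tf weights are invented for illustration.

```python
import numpy as np
from scipy.optimize import nnls

def inter_term_similarity(X, l2=1.0):
    """Approximate X ~= X @ W with W >= 0 and diag(W) = 0 (illustrative sketch)."""
    n_terms = X.shape[1]
    W = np.zeros((n_terms, n_terms))
    ridge = np.sqrt(l2) * np.eye(n_terms)     # ridge penalty via row augmentation
    for j in range(n_terms):
        A = np.vstack([X, ridge])             # min ||A w - b||^2 == LS fit + l2 penalty
        b = np.concatenate([X[:, j], np.zeros(n_terms)])
        A = np.delete(A, j, axis=1)           # drop column j: a term may not explain itself
        w, _ = nnls(A, b)                     # non-negative least squares
        W[:, j] = np.insert(w, j, 0.0)        # restore the zero diagonal entry
    return W

# Toy example: query vector stacked on top of three pseudo-relevant documents
# over a five-term vocabulary (made-up tf weights, purely illustrative).
q = np.array([2.0, 1.0, 0.0, 0.0, 0.0])
D = np.array([[3.0, 1.0, 2.0, 0.0, 0.0],
              [1.0, 2.0, 0.0, 1.0, 0.0],
              [2.0, 0.0, 1.0, 0.0, 1.0]])
X = np.vstack([q, D])
W = inter_term_similarity(X, l2=0.5)
print(np.round(q @ W, 3))                     # candidate expansion-term weights
```

Folding the ridge term into the system by augmenting the rows keeps the solver a plain NNLS call; the paper's actual regularisation and term-weighting scheme may of course differ from this sketch.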
Pages: 678-687
Number of pages: 10
Related Papers
50 records in total
  • [31] Neural Pseudo-Relevance Feedback Models for Sparse and Dense Retrieval
    Wang, Xiao
    PROCEEDINGS OF THE 45TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL (SIGIR '22), 2022, : 3497 - 3497
  • [32] Using Pseudo-Relevance Feedback to Improve Image Retrieval Results
    Torjmen, Mouna
    Pinel-Sauvagnat, Karen
    Boughanem, Mohand
    ADVANCES IN MULTILINGUAL AND MULTIMODAL INFORMATION RETRIEVAL, 2008, 5152 : 665 - 673
  • [33] Short text expansion and classification based on pseudo-relevance feedback
    Wang, Meng
    Lin, Lan-Fen
    Wang, Feng
Zhejiang Daxue Xuebao (Gongxue Ban)/Journal of Zhejiang University (Engineering Science), 2014, 48 (10): 1835 - 1842
  • [34] Pseudo-relevance feedback diversification of social image retrieval results
    Boteanu, Bogdan
    Mironica, Ionut
    Ionescu, Bogdan
    MULTIMEDIA TOOLS AND APPLICATIONS, 2017, 76 (09) : 11889 - 11916
  • [35] Multilingual Pseudo-Relevance Feedback: Performance Study of Assisting Languages
    Chinnakotla, Manoj K.
    Raman, Karthik
    Bhattacharyya, Pushpak
    ACL 2010: 48TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, 2010, : 1346 - 1356
  • [36] EFFECTIVE PSEUDO-RELEVANCE FEEDBACK FOR LANGUAGE MODELING IN SPEECH RECOGNITION
    Chen, Berlin
    Chen, Yi-Wen
    Chen, Kuan-Yu
    Jan, Ea-Ee
    2013 IEEE WORKSHOP ON AUTOMATIC SPEECH RECOGNITION AND UNDERSTANDING (ASRU), 2013, : 13 - 18
  • [37] Utilizing Pseudo-Relevance Feedback in Fusion-based Retrieval
    Roitman, Haggai
    PROCEEDINGS OF THE 2018 ACM SIGIR INTERNATIONAL CONFERENCE ON THEORY OF INFORMATION RETRIEVAL (ICTIR'18), 2018, : 203 - 206
  • [38] EFFECTIVE PSEUDO-RELEVANCE FEEDBACK FOR LANGUAGE MODELING IN EXTRACTIVE SPEECH SUMMARIZATION
    Liu, Shih-Hung
    Chen, Kuan-Yu
    Hsieh, Yu-Lun
    Chen, Berlin
    Wang, Hsin-Min
    Yen, Hsu-Chun
    Hsu, Wen-Lian
2014 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2014
  • [39] Cross-lingual pseudo-relevance feedback using a comparable corpus
    Rogati, M
    Yang, YM
EVALUATION OF CROSS-LANGUAGE INFORMATION RETRIEVAL SYSTEMS, 2002, 2406 : 151 - 157
  • [40] Pseudo-relevance feedback based query expansion using boosting algorithm
    Imran Rasheed
    Haider Banka
    Hamaid Mahmood Khan
    Artificial Intelligence Review, 2021, 54 : 6101 - 6124