Pre-trained Language Model based Ranking in Baidu Search

Cited by: 39
Authors
Zou, Lixin [1 ]
Zhang, Shengqiang [1 ]
Cai, Hengyi [1 ]
Ma, Dehong [1 ]
Cheng, Suqi [1 ]
Wang, Shuaiqiang [1 ]
Shi, Daiting [1 ]
Cheng, Zhicong [1 ]
Yin, Dawei [1 ]
Affiliations
[1] Baidu Inc, Beijing, People's Republic of China
Source
KDD '21: Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery & Data Mining, 2021
Keywords
Pre-trained Language Model; Learning to Rank
DOI
10.1145/3447548.3467147
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
As the heart of a search engine, the ranking system plays a crucial role in satisfying users' information demands. Recently, neural rankers fine-tuned from pre-trained language models (PLMs) have established state-of-the-art ranking effectiveness. However, it is non-trivial to apply these PLM-based rankers directly to a large-scale web search system, for the following reasons: (1) the prohibitively expensive computation of massive neural PLMs, especially over the long texts of web documents, precludes their deployment in an online ranking system that demands extremely low latency; (2) the discrepancy between existing ranking-agnostic pre-training objectives and ad-hoc retrieval scenarios, which demand comprehensive relevance modeling, is another major barrier to improving the online ranking system; (3) a real-world search engine typically involves a committee of ranking components, so the compatibility of an individually fine-tuned ranking model is critical for a cooperative ranking system. In this work, we contribute a series of successfully applied techniques for tackling these issues when deploying the state-of-the-art Chinese pre-trained language model, ERNIE, in the online search engine system. We first articulate a novel practice for cost-efficiently summarizing the web document and contextualizing the resultant summary with the query using a cheap yet powerful Pyramid-ERNIE architecture. We then devise an innovative paradigm for exploiting the large-scale noisy and biased post-click behavioral data for relevance-oriented pre-training. We also propose a human-anchored fine-tuning strategy tailored to the online ranking system, aiming to stabilize ranking signals across the various online components. Extensive offline and online experimental results show that the proposed techniques significantly boost the search engine's performance.
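The first technique (a query-dependent summary feeding a compact cross-encoder) can be illustrated with a minimal sketch. Everything below is an assumption for illustration: the term-overlap heuristic, the function names, and the "[SEP]"-joined input format are stand-ins; the paper's actual Pyramid-ERNIE uses its own summary-extraction method and a modified transformer architecture.

```python
from typing import List

def query_weighted_summary(query: str, sentences: List[str],
                           max_sentences: int = 4) -> str:
    """Select the document sentences that overlap most with the query.

    A crude stand-in for query-dependent summarization: it keeps the
    ranker's input short so a PLM cross-encoder can score it within a
    strict online latency budget.
    """
    q_terms = set(query.lower().split())

    def overlap(sent: str) -> int:
        return len(q_terms & set(sent.lower().split()))

    # Take the top-scoring sentences, then restore document order.
    top = sorted(range(len(sentences)), key=lambda i: overlap(sentences[i]),
                 reverse=True)[:max_sentences]
    return " ".join(sentences[i] for i in sorted(top))

def ranker_input(query: str, title: str, summary: str) -> str:
    # The online ranker scores a (query, title, summary) triple rather
    # than the full page; "[SEP]" mimics the usual BERT/ERNIE separator.
    return f"{query} [SEP] {title} [SEP] {summary}"
```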
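The relevance-oriented pre-training step mines weak supervision from post-click behavioral logs. The sketch below shows one common way to turn noisy click statistics into graded weak labels; the CTR thresholds, the three-grade scheme, and the impression cut-off are hypothetical, and the paper's own treatment of noisy, biased click data is more involved.

```python
from typing import Optional

def weak_relevance_label(clicks: int, impressions: int,
                         min_impressions: int = 20) -> Optional[int]:
    """Map aggregated click statistics to a graded weak relevance label.

    Returns None for low-traffic (query, document) pairs, whose
    click-through rates are too noisy to trust. All cut-offs here are
    illustrative only.
    """
    if impressions < min_impressions:
        return None            # too noisy: drop from pre-training data
    ctr = clicks / impressions
    if ctr >= 0.30:
        return 2               # likely relevant
    if ctr >= 0.10:
        return 1               # weakly relevant
    return 0                   # likely irrelevant
```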
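Finally, the human-anchored fine-tuning can be read as a pairwise preference objective regularized toward human-annotated grades, so that the fine-tuned scores stay calibrated for the other components of the ranking committee. This PyTorch sketch is one plausible instantiation under that reading; the anchoring term and its weight are assumptions, not the paper's formula.

```python
import torch
import torch.nn.functional as F

def human_anchored_loss(pos_scores: torch.Tensor,
                        neg_scores: torch.Tensor,
                        anchor_scores: torch.Tensor,
                        human_grades: torch.Tensor,
                        anchor_weight: float = 0.1) -> torch.Tensor:
    """Pairwise ranking loss plus a pointwise anchor to the human grade scale.

    pos_scores / neg_scores: model scores for human-preferred vs.
    dispreferred documents of the same query. anchor_scores /
    human_grades: scores and annotated grades for a set of anchor
    documents.
    """
    target = torch.ones_like(pos_scores)   # first argument should rank higher
    pair_loss = F.margin_ranking_loss(pos_scores, neg_scores, target,
                                      margin=1.0)
    # Pull raw scores toward the absolute human grades so downstream
    # ranking components that consume the scores see a stable signal.
    anchor_loss = F.mse_loss(anchor_scores, human_grades)
    return pair_loss + anchor_weight * anchor_loss
```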
Pages: 4014-4022 (9 pages)