Rax: Composable Learning-to-Rank using JAX

Cited by: 5
Authors
Jagerman, Rolf [1 ]
Wang, Xuanhui [1 ]
Zhuang, Honglei [1 ]
Qin, Zhen [1 ]
Bendersky, Michael [1 ]
Najork, Marc [1 ]
Affiliations
[1] Google Research, Seattle, WA 98105, USA
Source
PROCEEDINGS OF THE 28TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2022 | 2022
Keywords
Learning to Rank; JAX
DOI
10.1145/3534678.3539065
CLC number
TP [Automation technology, computer technology]
Discipline code
0812
Abstract
Rax is a library for composable Learning-to-Rank (LTR) written entirely in JAX. The goal of Rax is to facilitate easy prototyping of LTR systems by leveraging the flexibility and simplicity of JAX. Rax provides a diverse set of popular ranking metrics and losses that integrate well with the rest of the JAX ecosystem. Furthermore, Rax implements a system of ranking-specific function transformations which allows fine-grained customization of ranking losses and metrics. Most notably, Rax provides approx_t12n: a function transformation (t12n) that can transform any of our ranking metrics into an approximate and differentiable form that can be optimized. This provides a systematic way to directly optimize neural ranking models for ranking metrics that are not easily optimizable in other libraries. We empirically demonstrate the effectiveness of Rax by benchmarking neural models implemented using Flax and trained using Rax on two popular LTR benchmarks: WEB30K and Istella. Furthermore, we show that integrating ranking losses with T5, a large language model, can improve overall ranking performance on the MS MARCO passage ranking task. We are sharing the Rax library with the open source community as part of the larger JAX ecosystem at https://github.com/google/rax.
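The key idea behind an approximate-metric transformation like approx_t12n is to replace an item's hard (non-differentiable) rank with a smooth, sigmoid-based approximation, so that a metric such as NDCG becomes differentiable in the scores. The NumPy snippet below is an illustrative, framework-agnostic reconstruction of that smoothed-rank trick, not Rax's actual implementation; the function names (`approx_ranks`, `approx_ndcg`) and the `temperature` parameter are chosen here for illustration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def approx_ranks(scores, temperature=1.0):
    """Smooth approximation of 1-based ranks.

    rank_i ~= 1 + sum_{j != i} sigmoid((s_j - s_i) / temperature);
    as temperature -> 0 this approaches the exact rank.
    """
    diff = scores[None, :] - scores[:, None]  # diff[i, j] = s_j - s_i
    pair = sigmoid(diff / temperature)
    np.fill_diagonal(pair, 0.0)               # an item does not outrank itself
    return 1.0 + pair.sum(axis=1)

def approx_ndcg(scores, labels, temperature=1.0):
    """Differentiable NDCG surrogate using approximate ranks."""
    ranks = approx_ranks(scores, temperature)
    gains = 2.0 ** labels - 1.0
    dcg = np.sum(gains / np.log2(1.0 + ranks))
    # Ideal DCG: gains sorted in decreasing order at exact ranks 1..n.
    ideal_ranks = 1.0 + np.arange(len(labels))
    ideal_dcg = np.sum(np.sort(gains)[::-1] / np.log2(1.0 + ideal_ranks))
    return dcg / ideal_dcg
```

With a small temperature and scores already in the correct order, the surrogate is close to the exact NDCG of 1.0; because every operation is smooth, the same expression can be differentiated (e.g. with jax.grad in Rax's setting) and maximized directly.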
Pages: 3051-3060
Number of pages: 10