Incorporating Risk-Sensitiveness into Feature Selection for Learning to Rank

Cited by: 10
Authors
de Sousa, Daniel Xavier [1]
Canuto, Sergio Daniel [1]
Rosa, Thierson Couto [2]
Santos, Wellington [2]
Goncalves, Marcos Andre [1]
Affiliations
[1] Univ Fed Minas Gerais, DCC, Belo Horizonte, MG, Brazil
[2] UFG, INF, Jatai, GO, Brazil
Source
CIKM'16: PROCEEDINGS OF THE 2016 ACM CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT | 2016
Keywords
Learning to Rank; Feature Selection; Risk-Sensitiveness
DOI
10.1145/2983323.2983792
Chinese Library Classification
TP [Automation and Computer Technology]
Discipline Code
0812
Abstract
Learning to Rank (L2R) is currently an essential task in virtually all types of information systems, given the huge and ever-increasing amount of data made available. While many solutions have been proposed to improve L2R functions, relatively little attention has been paid to improving the quality of the feature space. L2R strategies usually rely on dense feature representations, which may contain noisy or redundant features that increase the cost of the learning process without any benefit. Although feature selection (FS) strategies can be applied to reduce dimensionality and noise, side effects of such procedures have been neglected, such as the risk of producing very poor predictions for a few (but important) queries. In this paper we propose multi-objective FS strategies that optimize both aspects at the same time: ranking performance and risk-sensitive evaluation. To do so, we approximate the Pareto-optimal set for multi-objective optimization in a new and original application to L2R. Our contributions include novel FS methods for L2R that optimize multiple, potentially conflicting, criteria. In particular, one of the objectives (risk-sensitive evaluation) has never before been optimized in the context of FS for L2R. Our experimental evaluation shows that the proposed methods select features that are both more effective (in ranking performance) and lower-risk than those selected by state-of-the-art FS methods.
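The core idea of keeping only feature subsets that are non-dominated under two objectives (ranking effectiveness, where higher is better, and risk, where lower is better) can be sketched as a Pareto-front filter. This is a minimal illustrative sketch, not the paper's actual algorithm; the candidate subsets, their names, and the scores are invented for illustration.

```python
def pareto_front(candidates):
    """Return the non-dominated candidates.

    Each candidate is (name, effectiveness, risk). Candidate a dominates b
    when a is at least as good on both objectives (effectiveness >=,
    risk <=) and strictly better on at least one of them.
    """
    def dominates(a, b):
        return (a[1] >= b[1] and a[2] <= b[2]) and (a[1] > b[1] or a[2] < b[2])

    return [c for c in candidates
            if not any(dominates(other, c) for other in candidates if other is not c)]

# Toy feature subsets scored on (effectiveness, risk) -- hypothetical values.
subsets = [
    ("all_features", 0.70, 0.30),
    ("subset_A",     0.72, 0.25),  # better than all_features on both objectives
    ("subset_B",     0.68, 0.20),  # trades some effectiveness for lower risk
    ("subset_C",     0.72, 0.28),  # dominated by subset_A
]
front = pareto_front(subsets)
print([name for name, _, _ in front])  # -> ['subset_A', 'subset_B']
```

Note that the front contains mutually incomparable trade-offs (subset_A is more effective, subset_B is lower-risk); a final choice among them would require an external preference or tie-breaking rule.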
Pages: 257-266 (10 pages)