Feature selection for label distribution learning under feature weight view

Cited by: 0
Authors
Shidong Lin
Chenxi Wang
Yu Mao
Yaojin Lin
Affiliations
[1] Minnan Normal University,School of Computer Science
[2] Minnan Normal University,Key Laboratory of Data Science and Intelligence Application
Source
International Journal of Machine Learning and Cybernetics | 2024, Vol. 15
Keywords
Feature selection; Label distribution learning; Feature weight; Mutual information; Label correlation;
DOI: not available
Abstract
Label Distribution Learning (LDL) is a fine-grained learning paradigm that addresses label ambiguity, yet it suffers from the curse of dimensionality. Feature selection is an effective method for dimensionality reduction, and several algorithms have been proposed for LDL that tackle the problem from different views. In this paper, we propose a novel feature selection method for LDL. First, an effective LDL model is trained with a classical LDL loss function composed of the maximum entropy model and KL divergence. Then, to select common and label-specific features, their weights are enhanced by the $l_{21}$-norm and label correlation, respectively. Considering that directly constraining the parameters by label correlation would force the label-specific features of relevant labels to become nearly identical, we instead adopt the strategy of constraining the output of the maximum entropy model. Finally, the proposed method introduces Mutual Information (MI) into the optimization model for LDL feature selection for the first time, distinguishing similar features and thereby reducing the influence of redundant features. Experimental results on twelve datasets validate the effectiveness of the proposed method.
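The core objective described in the abstract — a maximum entropy (softmax) output model fitted by KL divergence, with an $l_{21}$-norm penalty on the feature-weight matrix — can be sketched as below. This is a minimal illustration under stated assumptions, not the authors' implementation; the function names, the regularization weight `lam`, and the omission of the label-correlation and MI terms are all simplifications.

```python
import numpy as np

def predict_distribution(X, Theta):
    # Maximum entropy model: softmax over label scores X @ Theta,
    # where Theta is (n_features, n_labels).
    Z = X @ Theta
    Z = Z - Z.max(axis=1, keepdims=True)  # numerical stability
    E = np.exp(Z)
    return E / E.sum(axis=1, keepdims=True)

def ldl_objective(X, D, Theta, lam=0.1, eps=1e-12):
    # KL(D || P): divergence of the true label distributions D
    # from the predicted distributions P.
    P = predict_distribution(X, Theta)
    kl = np.sum(D * np.log((D + eps) / (P + eps)))
    # l_{21} norm: sum of l2 norms of Theta's rows. Because each row
    # corresponds to one feature across all labels, this drives whole
    # feature rows toward zero, yielding feature-level sparsity.
    l21 = np.sum(np.linalg.norm(Theta, axis=1))
    return kl + lam * l21
```

After minimizing this objective (the paper adds label-correlation and MI-based redundancy terms on top), features would be ranked by the row norms of `Theta`, and the rows closest to zero discarded.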
Pages: 1827–1840 (13 pages)