Multi-label dimensionality reduction and classification with extreme learning machines

Cited by: 1
Authors
Lin Feng [1 ,2 ]
Jing Wang [1 ,2 ]
Shenglan Liu [1 ,2 ]
Yao Xiao [1 ,2 ]
Affiliations
[1] Faculty of Electronic Information and Electrical Engineering, School of Computer Science and Technology, Dalian University of Technology
[2] School of Innovation Experiment, Dalian University of Technology
Funding
National Natural Science Foundation of China
Keywords
multi-label; dimensionality reduction; kernel trick; classification
DOI
Not available
CLC number
TP181 [automated reasoning, machine learning]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Driven by real applications such as text categorization and image classification, multi-label learning has become a popular research topic in recent years, and much attention has been paid to multi-label classification algorithms. Because the high dimensionality of multi-label datasets may cause the curse of dimensionality and hamper classification, a dimensionality reduction algorithm named multi-label kernel discriminant analysis (MLKDA) is proposed to reduce the dimensionality of multi-label datasets. Using the kernel trick, MLKDA processes the multiple labels as a whole and performs nonlinear dimensionality reduction with an idea similar to linear discriminant analysis (LDA). For classifying the reduced multi-label data, the extreme learning machine (ELM) is an efficient algorithm that maintains good accuracy. Combined with ELM, MLKDA performs well in multi-label learning experiments on several datasets. Experiments on both static data and data streams show that MLKDA outperforms multi-label dimensionality reduction via dependence maximization (MDDM) and multi-label linear discriminant analysis (MLDA) on balanced datasets and datasets with stronger correlation between labels, and that ELM is a good choice for multi-label classification.
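The MLKDA reduction step itself is not reproduced in this record, but the ELM classifier the abstract pairs it with can be sketched from its standard formulation: a random (untrained) hidden layer followed by output weights solved in closed form by least squares. The sketch below is illustrative, not the paper's implementation; all names, the toy data, and the ±1 multi-label encoding are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_fit(X, Y, n_hidden=64, rng=rng):
    """Fit an ELM: random fixed hidden layer, output weights by least squares."""
    W = rng.normal(size=(X.shape[1], n_hidden))  # random input weights (never trained)
    b = rng.normal(size=n_hidden)                # random hidden biases (never trained)
    H = np.tanh(X @ W + b)                       # hidden-layer activations
    beta = np.linalg.pinv(H) @ Y                 # output weights via Moore-Penrose pseudo-inverse
    return W, b, beta

def elm_predict(X, W, b, beta):
    """Return real-valued label scores; sign-threshold them for {-1, +1} targets."""
    return np.tanh(X @ W + b) @ beta

# Toy multi-label data: each of the 2 labels is the sign of one feature.
X = rng.normal(size=(200, 2))
Y = np.column_stack([np.sign(X[:, 0]), np.sign(X[:, 1])])

W, b, beta = elm_fit(X, Y)
pred = np.sign(elm_predict(X, W, b, beta))
accuracy = (pred == Y).mean()  # per-label training accuracy
```

Because only `beta` is learned, and by a single pseudo-inverse solve, training is fast, which is the efficiency property the abstract attributes to ELM; in the paper's pipeline, `X` would be the MLKDA-reduced features rather than raw inputs.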
Pages: 502-513 (12 pages)
Related papers
50 items
  • [21] Granular ball-based label enhancement for dimensionality reduction in multi-label data
    Wenbin Qian
    Wenyong Ruan
    Yihui Li
    Jintao Huang
    Applied Intelligence, 2023, 53 : 24008 - 24033
  • [22] Improving Multi-Instance Multi-Label Learning by Extreme Learning Machine
    Yin, Ying
    Zhao, Yuhai
    Li, Chengguang
    Zhang, Bin
    APPLIED SCIENCES-BASEL, 2016, 6 (06):
  • [23] Multi-label learning with label-specific feature reduction
    Xu, Suping
    Yang, Xibei
    Yu, Hualong
    Yu, Dong-Jun
    Yang, Jingyu
    Tsang, Eric C. C.
    KNOWLEDGE-BASED SYSTEMS, 2016, 104 : 52 - 61
  • [24] Multi-Label Manifold Learning
    Hou, Peng
    Geng, Xin
    Zhang, Min-Ling
    THIRTIETH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2016, : 1680 - 1686
  • [25] Deep Learning with a Rethinking Structure for Multi-label Classification
    Yang, Yao-Yuan
    Lin, Yi-An
    Chu, Hong-Min
    Lin, Hsuan-Tien
    ASIAN CONFERENCE ON MACHINE LEARNING, VOL 101, 2019, 101 : 125 - 140
  • [26] Learning Section Weights for Multi-label Document Classification
    Fard, Maziar Moradi
    Bayod, Paula Sorolla
    Motarjem, Kiomars
    Nejadi, Mohammad Alian
    Akhondi, Saber
    Thorne, Camilo
    NATURAL LANGUAGE PROCESSING AND INFORMATION SYSTEMS, PT II, NLDB 2024, 2024, 14763 : 359 - 366
  • [27] Integration of deep learning model and feature selection for multi-label classification
    Ebrahimi, Hossein
    Majidzadeh, Kambiz
    Gharehchopogh, Farhad Soleimanian
    INTERNATIONAL JOURNAL OF NONLINEAR ANALYSIS AND APPLICATIONS, 2022, 13 (01): : 2871 - 2883
  • [28] Dual dimensionality reduction on instance-level and feature-level for multi-label data
    Li, Haikun
    Fang, Min
    Wang, Peng
    NEURAL COMPUTING & APPLICATIONS, 2023, 35 (35): : 24773 - 24782
  • [29] XFL: Naming Functions in Binaries with Extreme Multi-label Learning
    Patrick-Evans, James
    Dannehl, Moritz
    Kinder, Johannes
    2023 IEEE SYMPOSIUM ON SECURITY AND PRIVACY, SP, 2023, : 2375 - 2390
  • [30] Supervised Deep Dictionary Learning for Single Label and Multi-Label Classification
    Singhal, Vanika
    Majumdar, Angshul
    2018 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2018,