Two-Stage Label Embedding via Neural Factorization Machine for Multi-Label Classification

Cited by: 0
Authors
Chen, Chen [1 ,3 ]
Wang, Haobo [1 ,3 ]
Liu, Weiwei [4 ]
Zhao, Xingyuan [1 ,3 ]
Hu, Tianlei [1 ,3 ]
Chen, Gang [2 ,3 ]
Affiliations
[1] Key Lab Big Data Intelligent Comp Zhejiang Prov, Hangzhou, Zhejiang, Peoples R China
[2] Zhejiang Univ, CAD & CG State Key Lab, Hangzhou, Zhejiang, Peoples R China
[3] Zhejiang Univ, Coll Comp Sci & Technol, Hangzhou, Zhejiang, Peoples R China
[4] East China Normal Univ, Sch Comp Sci & Software Engn, Shanghai, Peoples R China
Keywords
DOI
Not available
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Label embedding has been widely used to exploit label dependency with dimension reduction in multi-label classification tasks. However, existing embedding methods attempt to extract label correlations directly, and thus they can easily be trapped by complex label hierarchies. To tackle this issue, we propose a novel Two-Stage Label Embedding (TSLE) paradigm that involves a Neural Factorization Machine (NFM) to jointly project features and labels into a latent space. In the encoding phase, we introduce a Twin Encoding Network (TEN) that extracts pairwise feature and label interactions in the first stage and then efficiently learns higher-order correlations with deep neural networks (DNNs) in the second stage. After the codewords are obtained, a set of hidden layers is applied to recover the output labels in the decoding phase. Moreover, we develop a novel learning model by leveraging a max-margin encoding loss and a label-correlation-aware decoding loss, and we adopt mini-batch Adam to optimize it. Lastly, we also provide a kernel insight to better understand the proposed TSLE. Extensive experiments on various real-world datasets demonstrate that our proposed model significantly outperforms other state-of-the-art approaches.
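The two-stage encoder described in the abstract can be sketched at a high level: a standard Neural Factorization Machine combines a bi-interaction pooling layer (pairwise interactions between embedded inputs) with a small DNN that captures higher-order correlations. The following is a minimal NumPy sketch of that generic NFM structure, not the authors' actual TEN implementation; all names, dimensions, and the random weights are illustrative assumptions.

```python
import numpy as np

def bi_interaction(x, V):
    """First stage (generic NFM bi-interaction pooling): sums the
    element-wise products of all embedded input pairs via the identity
    sum_{i<j} (x_i v_i) * (x_j v_j)
      = 0.5 * ((sum_i x_i v_i)^2 - sum_i (x_i v_i)^2).
    x: (d,) concatenated feature/label vector; V: (d, k) embeddings.
    Names are hypothetical, not from the paper."""
    e = V * x[:, None]                      # (d, k) embedded inputs
    summed = e.sum(axis=0)                  # sum_i x_i v_i
    squared = (e ** 2).sum(axis=0)          # sum_i (x_i v_i)^2
    return 0.5 * (summed ** 2 - squared)    # (k,)

def relu(z):
    return np.maximum(z, 0.0)

def encode(x, V, W1, W2):
    """Second stage: a small DNN on top of the pooled pairwise
    interactions learns higher-order correlations, producing the
    low-dimensional codeword."""
    pooled = bi_interaction(x, V)
    return W2 @ relu(W1 @ pooled)

# Toy dimensions (assumed for illustration only).
d, k, hdim, m = 8, 4, 16, 3  # inputs, embedding, hidden, codeword dims
rng = np.random.default_rng(0)
x = rng.normal(size=d)
V = rng.normal(size=(d, k))
W1 = rng.normal(size=(hdim, k))
W2 = rng.normal(size=(m, hdim))
z = encode(x, V, W1, W2)
print(z.shape)  # (3,)
```

The pooling identity makes the pairwise stage linear in `d` rather than quadratic, which is the usual reason factorization-machine encoders scale to large feature/label spaces.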
Pages: 3304 - 3311
Page count: 8
Related Papers
50 in total
  • [1] Label Embedding for Multi-label Classification Via Dependence Maximization
    Li, Yachong
    Yang, Youlong
    NEURAL PROCESSING LETTERS, 2020, 52 (02) : 1651 - 1674
  • [2] Addressing class-imbalance in multi-label learning via two-stage multi-label hypernetwork
    Sun, Kai Wei
    Lee, Chong Ho
    NEUROCOMPUTING, 2017, 266 : 375 - 389
  • [3] A Label Embedding Method for Multi-label Classification via Exploiting Local Label Correlations
    Wang, Xidong
    Li, Jun
    Xu, Jianhua
    NEURAL INFORMATION PROCESSING, ICONIP 2019, PT V, 2019, 1143 : 168 - 180
  • [4] Group preserving label embedding for multi-label classification
    Kumar, Vikas
    Pujari, Arun K.
    Padmanabhan, Vineet
    Kagita, Venkateswara Rao
    PATTERN RECOGNITION, 2019, 90 : 23 - 34
  • [5] A Label Embedding Method via Conditional Covariance Maximization for Multi-label Classification
    Li, Dan
    Li, Yunqian
    Li, Jun
    Xu, Jianhua
    DATABASE AND EXPERT SYSTEMS APPLICATIONS, DEXA 2023, PT II, 2023, 14147 : 393 - 407
  • [6] Multi-label text classification via joint learning from label embedding and label correlation
    Liu, Huiting
    Chen, Geng
    Li, Peipei
    Zhao, Peng
    Wu, Xindong
    NEUROCOMPUTING, 2021, 460 : 385 - 398
  • [7] Cost-sensitive label embedding for multi-label classification
    Huang, Kuan-Hao
    Lin, Hsuan-Tien
    MACHINE LEARNING, 2017, 106 (9-10) : 1725 - 1746
  • [8] Multi-label Extreme Learning Machine Based on Label Matrix Factorization
    Li, Sihao
    Chen, Fucai
    Huang, Ruiyang
    Xie, Yixi
    2017 IEEE 2ND INTERNATIONAL CONFERENCE ON BIG DATA ANALYSIS (ICBDA), 2017, : 665 - 670