Semi-supervised label enhancement via structured semantic extraction

Cited by: 5
Authors
Wen, Tao [1 ,2 ]
Li, Weiwei [3 ]
Chen, Lei [4 ]
Jia, Xiuyi [1 ,2 ]
Affiliations
[1] Nanjing Univ Sci & Technol, Sch Comp Sci & Engn, Nanjing 210094, Peoples R China
[2] Chongqing Univ Posts & Telecommun, Chongqing Key Lab Computat Intelligence, Chongqing 400065, Peoples R China
[3] Nanjing Univ Aeronaut & Astronaut, Coll Comp Sci & Technol, Nanjing 211106, Peoples R China
[4] Nanjing Univ Posts & Telecommun, Sch Comp Sci, Nanjing 210023, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Label enhancement; Semi-supervised learning; Label distribution learning; Semantic extraction; INFORMATION; ALGORITHM;
DOI
10.1007/s13042-021-01439-w
CLC number
TP18 [Artificial intelligence theory];
Subject classification codes
081104; 0812; 0835; 1405;
Abstract
Label enhancement (LE) is the process of recovering label distributions from the logical labels in a dataset, with the goal of better expressing label ambiguity in the form of a label distribution. Existing LE work mainly focuses on exploring the data distribution in the feature space, assuming complete features and complete logical labels. However, multi-label datasets with logical labels for all samples are not always easy to obtain in the real world; most datasets have only a few annotated samples. To this end, we propose a novel semi-supervised label enhancement method via structured semantic extraction (SLE-SSE), which can recover the complete label distribution from only a few logical labels. First, we extract the self-semantics of samples by appropriately expressing the inherent ambiguity of each sample in the input space, and fill in the missing labels based on this information. Second, we exploit low-rank representation to extract the inter-semantics between samples and between labels, respectively. Finally, we apply a simple but effective linear model to recover the complete label distribution by utilizing the structured semantic information, including intra-sample, inter-sample and inter-label information. Extensive comparative experiments validate the effectiveness of the proposed method.
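The pipeline the abstract outlines (a low-rank treatment of the sparse logical label matrix, followed by a linear model that maps features to normalized label distributions) can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the authors' SLE-SSE implementation: `svd_shrink`, `recover_distribution`, the threshold `tau`, and the softmax normalization are all illustrative choices.

```python
import numpy as np

def svd_shrink(M, tau):
    # Singular value thresholding: the proximal operator of the nuclear
    # norm, a standard building block for low-rank representations.
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def recover_distribution(X, Y_logical, tau=1.0):
    # Hypothetical sketch: shrink the (incomplete) logical label matrix
    # toward a low-rank surrogate, fit a least-squares linear model from
    # features to that surrogate, then normalize each row with softmax
    # so every sample gets a valid label distribution.
    L = svd_shrink(Y_logical, tau)
    W, *_ = np.linalg.lstsq(X, L, rcond=None)
    scores = X @ W
    e = np.exp(scores - scores.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 8))                       # 50 samples, 8 features
Y = (rng.random((50, 4)) > 0.5).astype(float)      # sparse logical labels
D = recover_distribution(X, Y)
# Each row of D is non-negative and sums to 1, i.e. a label distribution.
```

The softmax step stands in for whatever normalization the paper uses; the key point is that each recovered row is turned into a proper distribution over the label set.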
Pages: 1131-1144
Page count: 14