Label Distribution Learning Method Based on Low-Rank Representation

Cited: 0
Authors
Liu R. [1 ]
Liu X. [1 ]
Li C. [1 ]
Affiliations
[1] School of Software Engineering, Xi'an Jiaotong University, Xi'an
Source
Moshi Shibie yu Rengong Zhineng/Pattern Recognition and Artificial Intelligence | 2021, Vol. 34, No. 2
Funding
National Natural Science Foundation of China
Keywords
Label Ambiguity; Label Distribution Learning(LDL); Low-Rank Representation(LRR); Multi-label Learning(MLL); Single-Label Learning;
DOI
10.16451/j.cnki.issn1003-6059.202102006
Abstract
Existing label distribution learning algorithms ignore label correlations, noise and corruption. To address this problem, a label distribution learning method based on low-rank representation (LDL-LRR) is proposed. A basis of the feature space is leveraged to represent the sample information, thereby achieving dimensionality reduction of the data in the original feature space. To capture the global structure of the data, the low-rank representation is transferred to the label space, imposing a low-rank constraint on the model. The augmented Lagrangian method and the quasi-Newton method are employed to solve the LRR problem and the objective function, respectively. Finally, the label distribution is predicted by a maximum entropy model. Experiments on 10 datasets show that LDL-LRR achieves good and stable performance. © 2021, Science Press. All rights reserved.
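The abstract names two computational ingredients: a low-rank constraint solved via the augmented Lagrangian method (whose core update is singular value thresholding) and a maximum entropy (softmax-style) model for the final label distribution prediction. The sketch below illustrates both pieces in isolation; it is not the authors' implementation, and all function names, toy data, and the choice of threshold are illustrative assumptions.

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: the proximal operator of the nuclear
    norm, the standard subproblem update when minimizing a low-rank
    (nuclear-norm) term with the augmented Lagrangian method."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def predict_label_distribution(X, W):
    """Maximum entropy model: maps features X (n x d) through weights
    W (d x c) to per-sample label distributions (rows sum to 1)."""
    scores = X @ W
    scores -= scores.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(scores)
    return e / e.sum(axis=1, keepdims=True)

# Toy demonstration with random data (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 4))   # 5 samples, 4 features
W = rng.normal(size=(4, 3))   # 3 labels
D = predict_label_distribution(X, W)
print(D.sum(axis=1))          # each row is a valid distribution

M = rng.normal(size=(6, 4))
L = svt(M, 1.0)               # shrinks singular values, lowering rank
```

In LDL-LRR-style formulations, the SVT step is applied inside each ALM iteration to update the low-rank coefficient matrix, while the softmax map is what the learned parameters are finally passed through to produce the predicted distribution.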
Pages: 146-156 (10 pages)
Related papers
35 in total
[11]  
LIU G C, LIN Z C, YAN S C, et al., Robust Recovery of Subspace Structures by Low-Rank Representation, IEEE Transactions on Pattern Analysis and Machine Intelligence, 35, 1, pp. 171-184, (2013)
[12]  
LI A, LIU X, CHEN D Y, et al., Robust Discriminative Feature Subspace Learning Based on Low Rank Representation, Journal of Electronics and Information Technology, 42, 5, pp. 1223-1230, (2020)
[13]  
GENG X, XU N., Label Distribution Learning and Label Enhancement, Scientia Sinica (Informationis), 48, 5, pp. 521-530, (2018)
[14]  
BERGER A L, DELLA PIETRA V J, DELLA PIETRA S A., A Maximum Entropy Approach to Natural Language Processing, Computational Linguistics, 22, 1, pp. 39-71, (1996)
[15]  
YANG X, GAO B B, XING C, et al., Deep Label Distribution Learning for Apparent Age Estimation, Proc of the IEEE International Conference on Computer Vision, pp. 102-108, (2015)
[16]  
DELLA PIETRA S, DELLA PIETRA V, LAFFERTY J., Inducing Features of Random Fields, IEEE Transactions on Pattern Analysis and Machine Intelligence, 19, 4, pp. 380-393, (1997)
[17]  
NOCEDAL J, WRIGHT S J., Numerical Optimization, (2006)
[18]  
KANUNGO T, MOUNT D M, NETANYAHU N S, et al., An Efficient k-means Clustering Algorithm: Analysis and Implementation, IEEE Transactions on Pattern Analysis and Machine Intelligence, 24, 7, pp. 881-892, (2002)
[19]  
REN T T, JIA X Y, LI W W, et al., Label Distribution Learning with Label Correlations via Low-Rank Approximation, Proc of the 28th International Joint Conference on Artificial Intelligence, pp. 3325-3331, (2019)
[20]  
WANG W W, LI X P, FENG X C, et al., A Survey on Sparse Subspace Clustering, Acta Automatica Sinica, 41, 8, pp. 1373-1384, (2015)