Multi-label feature selection with autoencoders and hypergraph learning

Cited by: 0
Authors
Tang C.-H. [1 ,2 ]
Zhu Q.-X. [1 ]
Hong C.-Q. [2 ]
Zhu W. [3 ]
Affiliations
[1] School of Information and Software Engineering, University of Electronic Science and Technology of China, Chengdu
[2] School of Computer and Information Engineering, Xiamen University of Technology, Xiamen
[3] Laboratory of Granular Computing, Minnan Normal University, Zhangzhou
Source
Zidonghua Xuebao/Acta Automatica Sinica | 2016, Vol. 42, No. 7
Funding
National Natural Science Foundation of China
Keywords
Autoencoders; Deep learning; Feature selection; Hypergraph; Multi-label;
DOI
10.16383/j.aas.2016.c150736
Abstract
In practical applications, data are increasingly annotated with multiple labels and contain considerable redundant information in the high-dimensional feature space. To improve the efficiency and effectiveness of multi-label data mining, feature selection for multi-label data has become a research hotspot. This paper uses denoising autoencoders to obtain a more robust feature representation of multi-label data. Furthermore, based on hypergraph learning theory, a hypergraph Laplacian matrix for the multi-label data is constructed by fusing the influence of all labels on the geometric relationships among the samples, and a lower-dimensional projection space is then obtained by eigenvalue decomposition of this Laplacian matrix. Experimental results on multi-label classification demonstrate the effectiveness and feasibility of the proposed algorithm. Copyright © 2016 Acta Automatica Sinica. All rights reserved.
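To make the second stage of this pipeline concrete, the following is a minimal NumPy sketch of the hypergraph step described in the abstract, under the assumption that each label defines one hyperedge over the samples annotated with it and that all hyperedge weights are one. The function names (hypergraph_laplacian, spectral_projection) and the toy label matrix are illustrative only; the paper's exact fusion of label effects and the preceding denoising-autoencoder representation are not reproduced here.

import numpy as np

def hypergraph_laplacian(Y, weights=None):
    """Normalized hypergraph Laplacian from a binary label matrix Y (n_samples x n_labels).

    Assumption for this sketch: each label is one hyperedge containing all samples
    carrying that label, with unit weight unless weights are given.
    """
    H = (Y > 0).astype(float)                 # incidence matrix: n x m
    w = np.ones(H.shape[1]) if weights is None else np.asarray(weights, dtype=float)
    d_v = H @ w                               # weighted vertex degrees
    d_e = H.sum(axis=0)                       # hyperedge degrees
    d_v[d_v == 0] = 1.0                       # guard against isolated samples
    d_e[d_e == 0] = 1.0                       # guard against empty labels
    Dv_inv_sqrt = np.diag(1.0 / np.sqrt(d_v))
    De_inv = np.diag(1.0 / d_e)
    W = np.diag(w)
    Theta = Dv_inv_sqrt @ H @ W @ De_inv @ H.T @ Dv_inv_sqrt
    return np.eye(H.shape[0]) - Theta         # L = I - Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2}

def spectral_projection(L, k):
    """Eigenvectors of L with the k smallest eigenvalues give a k-dimensional embedding."""
    eigvals, eigvecs = np.linalg.eigh(L)
    return eigvecs[:, np.argsort(eigvals)[:k]]

# Toy usage: 6 samples, 3 labels, 2-dimensional projection.
Y = np.array([[1, 0, 1],
              [1, 0, 0],
              [0, 1, 1],
              [0, 1, 0],
              [1, 1, 0],
              [0, 0, 1]])
Z = spectral_projection(hypergraph_laplacian(Y), k=2)
print(Z.shape)  # (6, 2)

In a sketch of this kind, the columns of Z play the role of the lower-dimensional projection space that the abstract obtains by eigenvalue decomposition of the hypergraph Laplacian.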
Pages: 1014-1021
Page count: 7