Graph regularized and sparse nonnegative matrix factorization with hard constraints for data representation

Cited by: 51
Authors
Sun, Fuming [1 ]
Xu, Meixiang [1 ]
Hu, Xuekao [1 ]
Jiang, Xiaojun [1 ]
Affiliations
[1] Liaoning Univ Technol, Jinzhou 121001, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Nonnegative matrix factorization; Graph-based regularizer; Sparseness constraints; Label information; PARTS;
DOI
10.1016/j.neucom.2015.01.103
CLC Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Nonnegative Matrix Factorization (NMF), a popular technique for finding parts-based, linear representations of nonnegative data, has been successfully applied in a wide range of applications. This is because it can provide components with physical meaning and interpretation, which is consistent with the psychological intuition of combining parts to form a whole. For practical classification tasks, however, NMF ignores both the local geometry of the data and the discriminative information of different classes. In addition, existing research demonstrates that leveraging sparseness can greatly enhance the ability to learn parts. Motivated by these advances, we propose a novel matrix decomposition algorithm, called Graph regularized and Sparse Non-negative Matrix Factorization with hard Constraints (GSNMFC). It attempts to find a compact representation of the data so that further learning tasks can be facilitated. The proposed GSNMFC jointly incorporates a graph regularizer, hard prior label information, and a sparseness constraint as additional conditions to uncover the intrinsic geometrical and discriminative structures of the data space. The corresponding update rules and convergence proofs for the optimization problem are also given in detail. Experimental results demonstrate the effectiveness of our algorithm in comparison to state-of-the-art approaches through a set of evaluations. (C) 2015 Elsevier B.V. All rights reserved.
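The abstract does not reproduce the paper's full GSNMFC update rules, but the graph-regularized NMF core it builds on (reference [1] below) can be sketched with standard multiplicative updates. The sketch below minimizes ||X - UV^T||_F^2 + lambda * Tr(V^T L V), where L = D - W is the graph Laplacian of an affinity matrix W; the function name `gnmf` and the parameter `lam` are illustrative, and the sparseness and hard label constraints of the actual GSNMFC are omitted.

```python
import numpy as np

def gnmf(X, k, W, lam=0.1, n_iter=200, eps=1e-9, seed=0):
    """Graph-regularized NMF sketch: X (m x n, nonnegative) ~ U @ V.T.

    W is an n x n symmetric nonnegative affinity matrix over the columns
    of X (the data points); lam weights the graph regularizer.
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    U = rng.random((m, k)) + eps   # basis matrix, kept nonnegative
    V = rng.random((n, k)) + eps   # coefficient matrix, kept nonnegative
    D = np.diag(W.sum(axis=1))     # degree matrix; L = D - W
    for _ in range(n_iter):
        # Multiplicative updates preserve nonnegativity of U and V.
        U *= (X @ V) / (U @ (V.T @ V) + eps)
        V *= (X.T @ U + lam * (W @ V)) / (V @ (U.T @ U) + lam * (D @ V) + eps)
    return U, V
```

With lam = 0 this reduces to plain NMF; larger lam pulls the low-dimensional coefficients of graph-adjacent data points toward each other, which is the "local geometry" the abstract refers to.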
Pages: 233-244 (12 pages)
References
29 entries in total
[1] Cai, Deng; He, Xiaofei; Han, Jiawei; Huang, Thomas S. Graph Regularized Nonnegative Matrix Factorization for Data Representation. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2011, 33(8): 1548-1560.
[2] Ding, Chris; Li, Tao; Jordan, Michael I. Convex and Semi-Nonnegative Matrix Factorizations. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2010, 32(1): 45-55.
[3] Donoho, D. L.; Elad, M. Optimally sparse representation in general (nonorthogonal) dictionaries via l1 minimization. Proceedings of the National Academy of Sciences of the United States of America, 2003, 100(5): 2197-2202.
[4] Dueck, D. Technical Report PSI-2004-23, 2004.
[5] Essid, Slim; Fevotte, Cedric. Smooth Nonnegative Matrix Factorization for Unsupervised Audiovisual Document Structuring. IEEE Transactions on Multimedia, 2013, 15(2): 415-425.
[6] Hoyer, P. O. Journal of Machine Learning Research, 2004, 5: 1457.
[7] Hoyer, P. O. Neural Networks for Signal Processing XII, Proceedings, 2002: 557. DOI: 10.1109/NNSP.2002.1030067.
[8] Jing, Liping; Zhang, Chao; Ng, Michael K. SNMFCA: Supervised NMF-Based Image Classification and Annotation. IEEE Transactions on Image Processing, 2012, 21(11): 4508-4521.
[9] Lee, D. D.; Seung, H. S. Learning the parts of objects by non-negative matrix factorization. Nature, 1999, 401(6755): 788-791.
[10] Lee, D. D. Advances in Neural Information Processing Systems, 2001, 13: 556.