Graph-Based Learning via Auto-Grouped Sparse Regularization and Kernelized Extension

Cited by: 39
Authors
Fang, Yuqiang [1 ]
Wang, Ruili [2 ]
Dai, Bin [1 ]
Wu, Xindong [3 ,4 ]
Affiliations
[1] Natl Univ Def Technol, Coll Mechatron Engn & Automat, Changsha 410073, Hunan, Peoples R China
[2] Massey Univ, Sch Engn & Adv Technol, Auckland, New Zealand
[3] Hefei Univ Technol, Sch Comp Sci & Informat Engn, Hefei, Peoples R China
[4] Univ Vermont, Dept Comp Sci, Burlington, VT USA
Funding
National Natural Science Foundation of China
Keywords
Graph based learning; sparse representation; spectral embedding; subspace learning; non-negative matrix factorization; VARIABLE SELECTION; FACE RECOGNITION; REPRESENTATION; ILLUMINATION; REGRESSION; FRAMEWORK; SHRINKAGE; ALGORITHM;
DOI
10.1109/TKDE.2014.2312322
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The key task in developing graph-based learning algorithms is constructing an informative graph that expresses the contextual information of a data manifold. Since traditional graph construction methods are sensitive to noise and poorly adaptive to changes in data density, a new method called the ℓ1-graph was proposed recently. A graph construction needs two important properties: sparsity and locality. The ℓ1-graph has a strong sparsity property but a weak locality property. We therefore propose a new method of constructing an informative graph using auto-grouped sparse regularization based on the ℓ1-graph, called the Group Sparse graph (GS-graph). We also show how to efficiently construct a GS-graph in a reproducing kernel Hilbert space using the kernel trick. The new methods, the GS-graph and its kernelized version (the KGS-graph), retain the noise-insensitivity of the ℓ1-graph while preserving both sparsity and locality. Furthermore, we integrate the proposed graphs with several graph-based learning algorithms to demonstrate their effectiveness. Empirical studies on benchmark datasets show that the proposed methods outperform the ℓ1-graph and other traditional graph construction methods in various learning tasks.
Pages: 142-154 (13 pages)
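The ℓ1-graph construction that the abstract builds on can be sketched as follows: each sample is sparsely coded over all remaining samples via an ℓ1-regularized least-squares (lasso) problem, and the coefficient magnitudes become edge weights. This is a minimal illustration only, not the authors' GS-graph method; the function names (`lasso_ista`, `build_l1_graph`) and the simple ISTA solver are our own choices, and the group-sparse regularization of the paper is not reproduced here.

```python
import numpy as np

def lasso_ista(D, y, lam=0.1, n_iter=500):
    """Solve min_w 0.5*||D w - y||^2 + lam*||w||_1 with plain ISTA (sketch)."""
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    w = np.zeros(D.shape[1])
    for _ in range(n_iter):
        g = D.T @ (D @ w - y)              # gradient of the quadratic term
        z = w - g / L                      # gradient step
        w = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold
    return w

def build_l1_graph(X, lam=0.1):
    """X: (n_samples, n_features). Returns a symmetric (n, n) weight matrix."""
    n = X.shape[0]
    W = np.zeros((n, n))
    for i in range(n):
        idx = [j for j in range(n) if j != i]   # dictionary = all other samples
        D = X[idx].T                            # columns are samples
        w = lasso_ista(D, X[i], lam=lam)
        W[i, idx] = np.abs(w)                   # coefficient magnitudes as edges
    return np.maximum(W, W.T)                   # symmetrize
```

Because each sample selects only the few other samples that best reconstruct it, the resulting graph is sparse and datum-adaptive, which is the property the GS-graph then strengthens with auto-grouped regularization to recover locality as well.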