Semi-supervised image classification via nonnegative least-squares regression

Cited by: 2
Authors
Ren, Wei-Ya [1 ]
Tang, Min [2 ]
Peng, Yang [2 ]
Li, Guo-Hui [2 ]
Affiliations
[1] Chinese Armed Police Force, Officers Coll, Dept Management Sci & Engn, Chengdu 610213, Sichuan, Peoples R China
[2] Natl Univ Def Technol, Coll Informat Syst & Management, Changsha 410073, Hunan, Peoples R China
Keywords
Graph construction; Semi-supervised learning; Label propagation; Least-squares regression; Non-negative constraint; Dimensionality reduction
DOI
10.1007/s00530-016-0521-x
Chinese Library Classification (CLC)
TP [Automation technology, computer technology]
Discipline classification code
0812
Abstract
Semi-supervised image classification is widely applied in various pattern recognition tasks. Label propagation, a graph-based semi-supervised learning method, is a popular approach to the semi-supervised image classification problem, and its most important step is graph construction. To improve the quality of the graph, we incorporate a nonnegative constraint and noise estimation into least-squares regression (LSR), yielding a novel graph construction method named nonnegative least-squares regression (NLSR). The nonnegative constraint eliminates subtractive combinations of coefficients and improves the sparsity of the graph, while modeling both small Gaussian noise and sparse corrupted noise improves the robustness of NLSR. Experiments show that the nonnegative constraint is highly significant in NLSR. A weighted version of NLSR (WNLSR) is proposed to further eliminate 'bridge' edges. Local and global consistency (LGC) serves as the semi-supervised classification method, with the label propagation error rate as the evaluation criterion. Experiments on image datasets show encouraging results for the proposed algorithm in comparison with state-of-the-art semi-supervised image classification algorithms; in particular, it improves on the LSR method significantly.
Pages: 725-738
Number of pages: 14