Robust Adaptive Embedded Label Propagation With Weight Learning for Inductive Classification

Cited by: 59
Authors
Zhang, Zhao [1 ,2 ]
Li, Fanzhang [1 ,2 ]
Jia, Lei [1 ,2 ]
Qin, Jie [3 ]
Zhang, Li [1 ,2 ]
Yan, Shuicheng [4 ]
Affiliations
[1] Soochow Univ, Sch Comp Sci & Technol, Suzhou 215006, Peoples R China
[2] Soochow Univ, Joint Int Res Lab Machine Learning & Neuromorph C, Suzhou 215006, Peoples R China
[3] Swiss Fed Inst Technol, Comp Vis Lab, CH-8092 Zurich, Switzerland
[4] Natl Univ Singapore, Dept Elect & Comp Engn, Singapore 117583, Singapore
Funding
China Postdoctoral Science Foundation; National Natural Science Foundation of China;
Keywords
Adaptive embedded label propagation; adaptive weight learning; inductive classification; robust l(2,1)-norm regularization; semi-supervised learning; DIMENSIONALITY REDUCTION; SPARSE REPRESENTATION; FRAMEWORK; NEIGHBORHOOD; ALGORITHM;
DOI
10.1109/TNNLS.2017.2727526
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
We propose a robust inductive semi-supervised label prediction model over the embedded representation, termed adaptive embedded label propagation with weight learning (AELP-WL), for classification. AELP-WL offers several properties. First, our method seamlessly integrates robust adaptive embedded label propagation with adaptive weight learning into a unified framework. By jointly minimizing the reconstruction errors over embedded features and embedded soft labels, AELP-WL explicitly ensures that the learned weights are jointly optimal for representation and classification, which differs from most existing LP models that perform weight learning separately in an independent step before label prediction. Second, existing models usually precalculate the weights over the original samples, which may contain unfavorable features and noise that degrade performance. To this end, our model adds a constraint that decomposes the original data into a sparse component encoding embedded noise-removed sparse representations of the samples and a sparse error part fitting the noise, and then performs the adaptive weight learning over the embedded sparse representations. Third, AELP-WL computes the projected soft labels by trading off the manifold smoothness and label fitness errors over the adaptive weights and the embedded representations, enhancing the label estimation power. By including a regressive label approximation error in the simultaneous minimization to correlate sample features with the embedded soft labels, the out-of-sample issue is naturally solved. By jointly minimizing the reconstruction errors over features and embedded soft labels, the classification error, and the label approximation error, state-of-the-art results are delivered.
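The abstract builds on graph-based label propagation. For orientation, here is a minimal sketch of the classic transductive variant (soft labels diffused over a symmetrically normalized graph), assuming a fixed Gaussian affinity with hypothetical parameters `alpha` and `sigma`. This is not the paper's AELP-WL model: AELP-WL additionally learns the weights adaptively over embedded sparse representations and handles out-of-sample data via a label regression term, whereas this sketch precomputes the weights over the raw samples, exactly the limitation the paper addresses.

```python
import numpy as np

def label_propagation(X, y, alpha=0.99, sigma=1.0, n_iter=200):
    """Generic graph-based label propagation (not AELP-WL).

    X: (n, d) samples; y: (n,) integer labels, -1 marking unlabeled points.
    alpha, sigma, n_iter are illustrative hyperparameters, not the paper's.
    """
    n = X.shape[0]
    classes = np.unique(y[y >= 0])

    # Fixed Gaussian affinity over the raw samples (the separate weight
    # precalculation step that AELP-WL replaces with joint weight learning).
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)

    # Symmetric normalization S = D^{-1/2} W D^{-1/2}.
    deg = W.sum(axis=1)
    dinv = 1.0 / np.sqrt(np.maximum(deg, 1e-12))
    S = W * dinv[:, None] * dinv[None, :]

    # One-hot soft labels for labeled points, zeros for unlabeled.
    Y = np.zeros((n, len(classes)))
    for j, c in enumerate(classes):
        Y[y == c, j] = 1.0

    # Iterate F <- alpha * S F + (1 - alpha) * Y: trade off manifold
    # smoothness against fitting the known labels.
    F = Y.copy()
    for _ in range(n_iter):
        F = alpha * (S @ F) + (1.0 - alpha) * Y
    return classes[F.argmax(axis=1)]
```

On two well-separated clusters with one labeled seed each, the propagated labels recover the cluster membership of all unlabeled points.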
Pages: 3388-3403
Page count: 16