Adaptive weighted learning for linear regression problems via Kullback-Leibler divergence

Cited by: 7
Authors
Liang, Zhizheng [1 ]
Li, Youfu [2 ]
Xia, ShiXiong [1 ]
Affiliations
[1] China Univ Min & Technol, Sch Comp Sci & Technol, Xuzhou, Peoples R China
[2] City Univ Hong Kong, Dept Mfg Engn & Engn Management, Hong Kong, Hong Kong, Peoples R China
Keywords
Linear regression; KL divergence; Weighted learning; Alternating optimization; Image classification; Face recognition; Cross-validation; Robust; Regularization; Representation; Covariance; Matrix
DOI
10.1016/j.patcog.2012.10.017
CLC Classification Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
In this paper, we propose adaptive weighted learning for linear regression problems via the Kullback-Leibler (KL) divergence. An alternating optimization method is used to solve the proposed model, and we theoretically demonstrate that the solution of the optimization algorithm converges to a stationary point of the model. In addition, we fuse global linear regression with class-oriented linear regression and discuss the problem of parameter selection. Experimental results on face and handwritten digit databases show that the proposed method is effective for image classification, particularly when the samples in the training and testing sets have different characteristics. (C) 2012 Elsevier Ltd. All rights reserved.
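The idea sketched in the abstract can be illustrated generically: a weighted least-squares objective whose per-sample weights carry a KL-divergence penalty toward the uniform distribution, solved by alternating between the regression coefficients and the weights. The objective, the closed-form weight update, and the names `adaptive_weighted_regression`, `gamma`, and `n_iter` below are illustrative assumptions for this sketch, not the authors' exact formulation.

```python
import numpy as np

def adaptive_weighted_regression(X, y, gamma=1.0, n_iter=20):
    """Alternating optimization for KL-regularized weighted least squares.

    Sketch (not the paper's exact model): minimize
        sum_i w_i * r_i(beta)^2 + gamma * KL(w || uniform)
    subject to sum_i w_i = 1, alternating between the coefficient
    vector beta and the sample weights w. The weight update has a
    closed form: w_i is proportional to exp(-r_i^2 / gamma).
    """
    n, d = X.shape
    w = np.full(n, 1.0 / n)                   # start from uniform weights
    beta = np.zeros(d)
    for _ in range(n_iter):
        # beta-step: weighted least squares under the current weights
        Xw = X * w[:, None]                   # rows of X scaled by w_i
        beta = np.linalg.solve(X.T @ Xw + 1e-8 * np.eye(d), Xw.T @ y)
        # w-step: softmax of negative squared residuals, scaled by gamma
        r2 = (y - X @ beta) ** 2
        w = np.exp(-(r2 - r2.min()) / gamma)  # shift for numerical stability
        w /= w.sum()
    return beta, w
```

With a single corrupted response, the alternation drives the outlier's weight toward zero while the fit converges to the inlier line; `gamma` controls how aggressively samples are downweighted (a large `gamma` keeps the weights near uniform and recovers ordinary least squares).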
Pages: 1209-1219
Page count: 11
Related Papers
44 in total