Boosting with pairwise constraints

Cited by: 13
Authors
Zhang, Changshui [1 ]
Cai, Qutang [1 ]
Song, Yangqiu [1 ]
Affiliations
[1] Tsinghua Univ, State Key Lab Intelligent Technol & Syst, Tsinghua Natl Lab Informat Sci & Technol, Dept Automat, Beijing 100084, Peoples R China
Keywords
Boosting; Pairwise constraints; Classifier ensemble; Semi-supervised learning; Gradient descent boosting; CLASSIFICATION; ALGORITHMS; FRAMEWORK
DOI
10.1016/j.neucom.2009.09.013
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Code
081104; 0812; 0835; 1405
Abstract
In supervised learning tasks, boosting can combine multiple weak learners into a stronger one. AdaBoost is one of the most popular boosting algorithms; it is widely used and has stimulated extensive research efforts in the boosting community. In contrast to supervised learning, semi-supervised learning aims to make full use of both labeled and unlabeled data to improve learning performance, and has drawn considerable interest in both research and applications. To harness the power of boosting, it is important and interesting to extend AdaBoost to semi-supervised scenarios. Moreover, in semi-supervised learning, incorporating pairwise constraints as side information is believed to be a promising way to obtain more satisfactory results. However, how to extend AdaBoost with pairwise constraints remains an open problem. In this paper, we propose a novel framework to solve this problem based on the gradient descent view of boosting. The proposed framework is almost as simple and flexible as AdaBoost, and can be readily applied in the presence of pairwise constraints. We present theoretical results, show possible further extensions, and validate the effectiveness via experiments. (C) 2009 Elsevier B.V. All rights reserved.
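The abstract only outlines the approach at a high level. As a rough illustration of the general idea, and not the authors' actual algorithm, the sketch below runs gradient-descent (AnyBoost-style) boosting on an exponential loss augmented with a penalty over must-link/cannot-link pairs. The objective, the penalty form, the decision-stump weak learner, and names such as `boost_with_constraints`, `lam`, and `n_rounds` are assumptions introduced here for illustration only.

```python
# Illustrative sketch only: gradient-descent boosting on an exponential loss
# plus a pairwise-constraint penalty. This is NOT the algorithm of the paper;
# the objective, penalty form, and all parameter names are assumptions.

import numpy as np
from sklearn.tree import DecisionTreeClassifier


def boost_with_constraints(X, y, constraints, lam=1.0, n_rounds=20, step=0.1):
    """X: (n, d) features; y: labels in {-1, +1}, with 0 marking unlabeled points;
    constraints: list of (i, j, c) with c=+1 for must-link, c=-1 for cannot-link."""
    n = X.shape[0]
    F = np.zeros(n)                      # current ensemble scores F(x_k)
    learners = []                        # list of (weak learner, step size)

    for _ in range(n_rounds):
        # Negative functional gradient of the combined objective at each point:
        # labeled exponential loss plus lam * exp(-c * F(x_i) * F(x_j)) per pair.
        g = np.where(y != 0, y * np.exp(-y * F), 0.0)
        for i, j, c in constraints:
            e = np.exp(-c * F[i] * F[j])
            g[i] += lam * c * F[j] * e
            g[j] += lam * c * F[i] * e

        # AnyBoost-style weak learning step: fit a stump to the sign of the
        # negative gradient, weighting each point by the gradient magnitude.
        target = np.where(g >= 0, 1, -1)
        weight = np.abs(g) + 1e-12
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, target, sample_weight=weight)
        h = stump.predict(X).astype(float)

        F += step * h                    # fixed small step instead of a line search
        learners.append((stump, step))

    return learners


def predict(learners, X):
    """Sign of the additive ensemble built by boost_with_constraints."""
    score = sum(a * clf.predict(X) for clf, a in learners)
    return np.where(score >= 0, 1, -1)
```

With `lam=0` and an exact line search in place of the fixed step, this kind of functional-gradient loop on the exponential loss recovers standard AdaBoost on the labeled points alone, which is the sense in which the abstract's "gradient descent view of boosting" generalizes AdaBoost.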
Pages: 908-919
Number of pages: 12