A bagging SVM to learn from positive and unlabeled examples

Cited by: 201
Authors
Mordelet, F. [1 ]
Vert, J. -P. [2 ,3 ,4 ]
Affiliations
[1] Duke Univ, Durham, NC 27708 USA
[2] Mines ParisTech, Ctr Computat Biol, F-77300 Fontainebleau, France
[3] Inst Curie, F-75005 Paris, France
[4] INSERM, U900, F-75005 Paris, France
Keywords
PU learning; Bagging; SVM; Classification
DOI
10.1016/j.patrec.2013.06.010
CLC classification
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We consider the problem of learning a binary classifier from a training set of positive and unlabeled examples, both in the inductive and in the transductive setting. This problem, often referred to as PU learning, differs from the standard supervised classification problem by the lack of negative examples in the training set. It corresponds to a ubiquitous situation in many applications, such as information retrieval or gene ranking, where we have identified a set of data of interest sharing a particular property and wish to automatically retrieve additional data sharing the same property from a large and easily available pool of unlabeled data. We propose a new method for PU learning with a conceptually simple implementation based on bootstrap aggregating (bagging) techniques: the algorithm iteratively trains many binary classifiers to discriminate the known positive examples from random subsamples of the unlabeled set, and averages their predictions. We show theoretically and experimentally that the method can match and even outperform state-of-the-art methods for PU learning, particularly when the number of positive examples is limited and the fraction of negatives among the unlabeled examples is small. The proposed method can also run considerably faster than state-of-the-art methods, particularly when the set of unlabeled examples is large. (C) 2013 Elsevier B.V. All rights reserved.
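The bagging scheme described in the abstract can be sketched in a few lines. This is a minimal illustration of the general idea, not the authors' implementation: the function name `bagging_svm_pu`, the RBF kernel, the default subsample size, and the out-of-bag averaging over unlabeled points are all assumptions made for the sketch.

```python
import numpy as np
from sklearn.svm import SVC

def bagging_svm_pu(X_pos, X_unlabeled, n_bootstraps=30, subsample_size=None, rng_seed=0):
    """Sketch of bagging-SVM PU learning: repeatedly train an SVM to separate
    the known positives from a random subsample of the unlabeled pool, and
    average the decision scores of each unlabeled point over the rounds in
    which it was left out of the training subsample (out of bag)."""
    rng = np.random.default_rng(rng_seed)
    if subsample_size is None:
        # Size of each unlabeled subsample is a free parameter of the method;
        # matching the number of positives is just one reasonable default.
        subsample_size = len(X_pos)
    scores = np.zeros(len(X_unlabeled))
    counts = np.zeros(len(X_unlabeled))
    for _ in range(n_bootstraps):
        idx = rng.choice(len(X_unlabeled), size=subsample_size, replace=True)
        X_train = np.vstack([X_pos, X_unlabeled[idx]])
        y_train = np.concatenate([np.ones(len(X_pos)), np.zeros(subsample_size)])
        clf = SVC(kernel="rbf", C=1.0).fit(X_train, y_train)
        # Accumulate decision scores only for unlabeled points outside this subsample.
        mask = np.ones(len(X_unlabeled), dtype=bool)
        mask[idx] = False
        scores[mask] += clf.decision_function(X_unlabeled[mask])
        counts[mask] += 1
    counts[counts == 0] = 1  # guard: points never left out keep score 0
    return scores / counts   # higher score = more likely a hidden positive
```

On toy data with well-separated positives and negatives, hidden positives in the unlabeled pool receive higher average scores than the negatives, which is the ranking behavior the abstract describes.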
Pages: 201-209
Number of pages: 9
References
31 items in total
[1]   Gene prioritization through genomic data fusion [J].
Aerts, S ;
Lambrechts, D ;
Maity, S ;
Van Loo, P ;
Coessens, B ;
De Smet, F ;
Tranchevent, LC ;
De Moor, B ;
Marynen, P ;
Hassan, B ;
Carmeliet, P ;
Moreau, Y .
NATURE BIOTECHNOLOGY, 2006, 24 (05) :537-544
[2]  
[Anonymous], 2011, Proceedings of the 14th International Conference on Artificial Intelligence and Statistics, AISTATS 2011
[3]  
Bartlett PL, 2007, J MACH LEARN RES, V8, P775
[4]   Building text classifiers using positive and unlabeled examples [J].
Bing, L ;
Yang, D ;
Li, XL ;
Lee, WS ;
Yu, PS .
THIRD IEEE INTERNATIONAL CONFERENCE ON DATA MINING, PROCEEDINGS, 2003, :179-186
[5]   Random forests [J].
Breiman, L .
MACHINE LEARNING, 2001, 45 (01) :5-32
[7]   LIBSVM: A Library for Support Vector Machines [J].
Chang, Chih-Chung ;
Lin, Chih-Jen .
ACM TRANSACTIONS ON INTELLIGENT SYSTEMS AND TECHNOLOGY, 2011, 2 (03)
[8]  
Chapelle Olivier, 2006, IEEE Transactions on Neural Networks, DOI DOI 10.1109/TNN.2009.2015974
[9]   Kernel-based data fusion for gene prioritization [J].
De Bie, Tijl ;
Tranchevent, Leon-Charles ;
Van Oeffelen, Liesbeth M. M. ;
Moreau, Yves .
BIOINFORMATICS, 2007, 23 (13) :I125-I132
[10]   Learning from positive and unlabeled examples [J].
Denis, F ;
Gilleron, R ;
Letouzey, F .
THEORETICAL COMPUTER SCIENCE, 2005, 348 (01) :70-83