Support vector machines based on posteriori probability

Cited by: 4
Authors
Affiliations
[1] Intelligent Info. Proc. Open Lab., Inst. of Comp. Technol., Chinese Acad. of Sci.
[2] Lab. of Comp. Syst., Inst. of Automat., Chinese Acad. of Sci.
[3] 2nd Dept., New Star Res. Inst. of Appl. Tech.
Source
2005 / Science Press, Beijing, China / Vol. 42
Keywords
Classification; Margin; Maximal margin algorithm; Posterior probability; Support vector machines; Uncertain classification problem;
DOI
10.1360/crad20050203
CLC number
Subject classification code
Abstract
To solve the uncertain classification problem, an SVM (support vector machine) is trained to behave like a Bayesian optimal classifier on the training data. The idea is to weight each unbalanced training sample by its posterior probability. A complete framework for the posterior probability support vector machine (PPSVM) is presented, and the standard SVM is reformulated as a PPSVM. Linear separability, the margin, the optimal hyperplane, and soft-margin algorithms are discussed; a new optimization problem is obtained, and a new definition of support vector is given. PPSVM is motivated by statistical learning theory and is an extension of the regular SVM. An empirical method for determining the posterior probability is also proposed. Two artificial examples show that the PPSVM formulation is reasonable when the class-conditional probability is known, and experiments on real data demonstrate that weighting the data by the empirical method can produce better results than the regular SVM.
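The core idea in the abstract, scaling each training sample's contribution to the SVM objective by a posterior-probability weight, can be illustrated with a minimal sketch. This is not the authors' exact PPSVM formulation; it is a linear soft-margin SVM trained by subgradient descent on a hinge loss in which each sample's slack penalty is multiplied by a hypothetical per-sample weight `p_i` standing in for the posterior probability.

```python
def train_weighted_svm(X, y, weights, C=1.0, lr=0.01, epochs=200):
    """Linear soft-margin SVM via subgradient descent on the hinge loss.

    X       -- list of feature vectors (lists of floats)
    y       -- labels in {-1, +1}
    weights -- per-sample weights in (0, 1]; here they play the role of
               the posterior-probability weights described in the abstract
               (an illustrative stand-in, not the paper's exact scheme)
    """
    d = len(X[0])
    w = [0.0] * d
    b = 0.0
    for _ in range(epochs):
        for xi, yi, pi in zip(X, y, weights):
            margin = yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b)
            # Gradient of the regularizer (1/2)||w||^2.
            grad_w = list(w)
            grad_b = 0.0
            if margin < 1:
                # Hinge loss is active: its subgradient is scaled by the
                # sample's posterior-probability weight pi.
                grad_w = [gw - C * pi * yi * xj for gw, xj in zip(grad_w, xi)]
                grad_b = -C * pi * yi
            w = [wj - lr * gw for wj, gw in zip(w, grad_w)]
            b -= lr * grad_b
    return w, b

def predict(w, b, x):
    """Sign of the decision function."""
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1
```

Setting every weight to 1 recovers an ordinary soft-margin SVM; lowering a sample's weight toward 0 lets the classifier tolerate its misclassification more cheaply, which is how posterior-probability weighting downplays uncertain samples.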
Pages: 196-202
Page count: 6