Bayesian learning of inverted Dirichlet mixtures for SVM kernels generation

Cited by: 38
Authors
Bdiri, Taoufik [1 ]
Bouguila, Nizar [2 ]
Affiliations
[1] Concordia Univ, Elect & Comp Engn Dept ECE, Montreal, PQ H3G 1T7, Canada
[2] Concordia Univ, Concordia Inst Informat Syst Engn CIISE, Montreal, PQ H3G 1T7, Canada
Funding
Natural Sciences and Engineering Research Council of Canada;
Keywords
Mixture models; SVM; Hybrid models; Inverted Dirichlet; Bayesian inference; Bayes factor; Model selection; Gibbs sampling; Kernels; Object detection; Image databases; NORMALIZING CONSTANTS; MONTE-CARLO; FIT DATA; MODEL; CONVERGENCE; APPROXIMATIONS; SIMULATION; CHOICE; RATIOS; RATES;
DOI
10.1007/s00521-012-1094-z
Chinese Library Classification (CLC) number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We describe approaches for modeling and classifying positive data using both finite inverted Dirichlet mixture models and support vector machines (SVMs). Inverted Dirichlet mixture models are used to tackle an outstanding challenge in SVMs, namely the generation of accurate kernels. The kernel generation approaches that we consider, grounded in ideas from information theory, allow the data structure and its structural constraints to be incorporated. Inverted Dirichlet mixture models are learned within a principled Bayesian framework, using both the Gibbs sampler and the Metropolis-Hastings algorithm for parameter estimation and the Bayes factor for model selection (i.e., determining the number of mixture components). Our Bayesian learning approach derives priors over the model parameters by showing that the inverted Dirichlet distribution belongs to the exponential family of distributions, and then combines these priors with information from the data to build posterior distributions. We illustrate the merits and effectiveness of the proposed method with two challenging real-world applications, namely object detection and visual scene analysis and classification.
Pages: 1443-1458
Number of pages: 16
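The abstract above combines a generative mixture of inverted Dirichlet distributions with an SVM whose kernel is derived from that mixture. What follows is a minimal sketch of those two building blocks, not the authors' implementation: it evaluates the inverted Dirichlet log-density for positive data and builds one common information-theoretic kernel from the mixture's posterior component probabilities. The kernel choice, function names, and toy parameters are assumptions for illustration only; in the paper the mixture parameters would come from the Bayesian (MCMC) learning step.

```python
# Sketch only: inverted Dirichlet mixture evaluation and a mixture-based SVM kernel.
# The posterior-probability kernel below is one common choice, assumed for illustration.
import numpy as np
from scipy.special import gammaln


def inverted_dirichlet_logpdf(x, alpha):
    """Log-density of the inverted Dirichlet for a positive vector x (length D)
    with parameters alpha (length D + 1)."""
    x = np.asarray(x, dtype=float)
    a = np.asarray(alpha, dtype=float)
    log_norm = gammaln(a.sum()) - gammaln(a).sum()
    return (log_norm
            + np.sum((a[:-1] - 1.0) * np.log(x))
            - a.sum() * np.log1p(x.sum()))


def responsibilities(x, weights, alphas):
    """Posterior probabilities p(component j | x) under a finite mixture."""
    logp = np.array([np.log(w) + inverted_dirichlet_logpdf(x, a)
                     for w, a in zip(weights, alphas)])
    logp -= logp.max()                      # subtract max for numerical stability
    p = np.exp(logp)
    return p / p.sum()


def mixture_kernel(x, y, weights, alphas):
    """Illustrative SVM kernel K(x, y) = sum_j p(j | x) p(j | y)."""
    return float(responsibilities(x, weights, alphas)
                 @ responsibilities(y, weights, alphas))


# Toy usage with a hypothetical two-component mixture on 3-dimensional positive data.
weights = [0.6, 0.4]
alphas = [np.array([2.0, 3.0, 4.0, 5.0]), np.array([7.0, 2.0, 2.0, 3.0])]
print(mixture_kernel([0.5, 1.2, 0.8], [0.4, 1.0, 0.9], weights, alphas))
```

Such a kernel can be passed to any SVM implementation that accepts precomputed Gram matrices; its value is large when two samples are explained by the same mixture components, which is how the mixture model injects the data structure into the SVM.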