Bayesian hybrid generative discriminative learning based on finite Liouville mixture models

Cited by: 50
Authors
Bouguila, Nizar [1 ]
Affiliations
[1] Concordia Univ, Fac Engn & Comp Sci, Concordia Inst Informat Syst Engn, Montreal, PQ H3G 2W1, Canada
Funding
Natural Sciences and Engineering Research Council of Canada;
Keywords
Liouville family of distributions; Generative models; Discriminative models; Mixture models; SVM; Bayesian inference; Exponential family; Conjugate prior; Gibbs sampling; Bayes factor; Image classification; Texture modeling; DIRICHLET MIXTURE; DISTANCE MEASURES; DIVERGENCE; SCENE; DISTRIBUTIONS; SIMULATION;
DOI
10.1016/j.patcog.2010.12.010
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory];
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Recently, hybrid generative-discriminative approaches have emerged as an efficient engine for knowledge representation and data classification. However, little attention has been devoted to the modeling and classification of non-Gaussian, and especially proportional, vectors. Our main goal in this paper is to discover the true structure of this kind of data by building probabilistic kernels from generative mixture models based on the Liouville family of distributions, from which we develop the Beta-Liouville distribution, which includes the well-known Dirichlet as a special case. The Beta-Liouville has a more general covariance structure than the Dirichlet, which makes it more practical and useful. Our learning technique is based on a principled, purely Bayesian approach; the resulting models are used to generate support vector machine (SVM) probabilistic kernels based on information divergence. In particular, we show the existence of closed-form expressions for the Kullback-Leibler and Rényi divergences between two Beta-Liouville distributions, and then between two Dirichlet distributions as a special case. Through extensive simulations and a number of experiments involving synthetic data, visual scene classification, and texture image classification, we demonstrate the effectiveness of the proposed approaches. (C) 2010 Elsevier Ltd. All rights reserved.
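Since this record only summarizes the approach, the following is a minimal illustrative sketch rather than the paper's implementation: it evaluates the well-known closed-form Kullback-Leibler divergence between two Dirichlet distributions (the special case mentioned in the abstract) and turns a symmetrized version of it into a precomputed SVM kernel. The exp(-divergence) kernel form, the per-sample Dirichlet parameter vectors, and the scale parameter are assumptions chosen for illustration; the paper itself works with Beta-Liouville mixture models.

```python
# Illustrative sketch only: closed-form KL divergence between two Dirichlet
# distributions, used to build a divergence-based, precomputed SVM kernel.
# The exp(-divergence) kernel form and the per-sample Dirichlet parameters
# are assumptions for illustration, not the paper's exact construction.
import numpy as np
from scipy.special import gammaln, psi
from sklearn.svm import SVC

def dirichlet_kl(alpha, beta):
    """KL( Dir(alpha) || Dir(beta) ) for parameter vectors of equal length."""
    alpha, beta = np.asarray(alpha, float), np.asarray(beta, float)
    a0, b0 = alpha.sum(), beta.sum()
    return (gammaln(a0) - gammaln(alpha).sum()
            - gammaln(b0) + gammaln(beta).sum()
            + np.dot(alpha - beta, psi(alpha) - psi(a0)))

def divergence_kernel(params, scale=1.0):
    """Gram matrix K[i, j] = exp(-scale * symmetrized KL); note this
    exponential form is not guaranteed positive semi-definite."""
    n = len(params)
    K = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            d = 0.5 * (dirichlet_kl(params[i], params[j])
                       + dirichlet_kl(params[j], params[i]))
            K[i, j] = np.exp(-scale * d)
    return K

# Toy usage: each sample is represented by a fitted Dirichlet parameter vector.
rng = np.random.default_rng(0)
params = [rng.uniform(0.5, 5.0, size=3) for _ in range(20)]
labels = np.array([0] * 10 + [1] * 10)
K = divergence_kernel(params)
clf = SVC(kernel="precomputed").fit(K, labels)
print(clf.predict(K[:5]))
```

In this sketch the generative step (fitting one distribution per sample or per mixture component) is replaced by random parameter vectors purely to keep the example self-contained; the point is how a closed-form divergence feeds a precomputed SVM kernel.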
Pages: 1183-1200
Page count: 18