A general procedure for learning mixtures of independent component analyzers

Times cited: 33
Authors
Salazar, Addisson [1 ]
Vergara, Luis [1 ]
Serrano, Arturo [1 ]
Igual, Jorge [1 ]
Affiliations
[1] Univ Politecn Valencia, Signal Proc Grp GTS, Inst Telecommun & Multimedia Applicat ITEAM, Valencia 46022, Spain
Keywords
ICA mixture model; ICA; BSS; Non-parametric density estimation; Semi-supervised learning; BLIND SOURCE SEPARATION; UNSUPERVISED CLASSIFICATION; ICA; MODEL; ALGORITHMS;
DOI
10.1016/j.patcog.2009.05.013
Chinese Library Classification (CLC) number
TP18 [Theory of artificial intelligence]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
This paper presents a new procedure for learning mixtures of independent component analyzers. The procedure includes non-parametric estimation of the source densities, supervised-unsupervised learning of the model parameters, incorporation of any independent component analysis (ICA) algorithm into the learning of the ICA mixtures, and estimation of residual dependencies after training for correcting the posterior probability of every class given the testing observation vector. We demonstrate the performance of the procedure in the classification of ICA mixtures of two, three, and four classes of synthetic data, and in the classification of defective materials, consisting of 3D finite element models and lab specimens, in non-destructive testing using the impact-echo technique. Applying the proposed posterior probability correction improves the classification accuracy. Semi-supervised learning shows that unlabeled data can degrade the performance of the classifier when they do not fit the generative model. Comparative results of the proposed method and standard ICA algorithms for blind source separation in single and multiple ICA data mixtures show the suitability of the non-parametric ICA mixture-based method for data modeling. (C) 2009 Elsevier Ltd. All rights reserved.
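As a concrete illustration of the procedure outlined in the abstract, the sketch below shows, under stated assumptions, how class posteriors can be computed in an ICA mixture model whose source densities are estimated non-parametrically (here with a Gaussian kernel density estimator from SciPy). It is not the authors' implementation: the class name ICAMixtureClassifier, its parameters, and the data layout are illustrative, and the residual-dependency correction of the posteriors described in the abstract is omitted.

```python
# A minimal sketch, assuming an ICA mixture model in the style described in the
# abstract: each class C_k has an unmixing matrix W_k, a bias vector b_k and
# non-parametric (kernel) estimates of its source densities. Names below are
# illustrative, not taken from the paper or its code.
import numpy as np
from scipy.stats import gaussian_kde


class ICAMixtureClassifier:
    def __init__(self, unmixing, biases, source_samples, priors):
        # unmixing[k]:       (d, d) unmixing matrix W_k for class k, learned by
        #                    any ICA algorithm (the procedure is ICA-agnostic).
        # biases[k]:         (d,) bias/centering vector b_k for class k.
        # source_samples[k]: (d, n_k) sources recovered from the training data
        #                    of class k, used only for density estimation.
        # priors[k]:         prior probability P(C_k).
        self.W = [np.asarray(Wk, dtype=float) for Wk in unmixing]
        self.b = [np.asarray(bk, dtype=float) for bk in biases]
        self.priors = np.asarray(priors, dtype=float)
        # Kernel density estimate of every source density (one per source and
        # class): the non-parametric source-density step of the procedure.
        self.kdes = [[gaussian_kde(np.asarray(S)[i])
                      for i in range(np.asarray(S).shape[0])]
                     for S in source_samples]

    def log_likelihood(self, x, k):
        # log p(x | C_k) = sum_i log p_{k,i}(s_i) + log |det W_k|,
        # where s = W_k (x - b_k) and the sources are assumed independent.
        s = self.W[k] @ (np.asarray(x, dtype=float) - self.b[k])
        log_sources = sum(np.log(self.kdes[k][i](s[i])[0] + 1e-300)
                          for i in range(len(s)))
        return log_sources + np.log(abs(np.linalg.det(self.W[k])))

    def posteriors(self, x):
        # Bayes' rule over the classes, evaluated in the log domain for
        # numerical stability; returns P(C_k | x) for every class k.
        log_post = np.array([self.log_likelihood(x, k) + np.log(self.priors[k])
                             for k in range(len(self.W))])
        log_post -= log_post.max()
        p = np.exp(log_post)
        return p / p.sum()
```

A test observation would be assigned to the class with the largest posterior; the paper's correction for residual dependencies among the recovered sources would adjust these posteriors before that decision, and its semi-supervised parameter learning would replace the fixed W_k, b_k, and priors assumed here.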
Pages: 69-85
Number of pages: 17