Expectation-maximization approaches to independent component analysis

Cited: 4
Authors
Zhong, MJ
Tang, HW
Tang, YY [1 ]
Affiliations
[1] Dalian Univ Technol, Inst Neuroinformat, Dalian 116023, Peoples R China
[2] Dalian Univ Technol, Inst Computat Biol & Bioinformat, Dalian 116023, Peoples R China
[3] Chinese Acad Sci, Lab Visual Informat Proc, Beijing 100101, Peoples R China
[4] Chinese Acad Sci, Key Lab Mental Hlth, Beijing 100101, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
independent component analysis; overcomplete representations; EM algorithm; variational method;
DOI
10.1016/j.neucom.2004.06.007
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Expectation-Maximization (EM) algorithms for independent component analysis are presented in this paper. For super-Gaussian sources, a variational method is employed to develop an EM algorithm in closed form for learning the mixing matrix and inferring the independent components. For sub-Gaussian sources, a symmetrical form of the Pearson mixture model (Neural Comput. 11 (2) (1999) 417-441) is used as the prior, which also enables the development of an EM algorithm in closed form for parameter estimation. (C) 2004 Elsevier B.V. All rights reserved.
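As a rough illustration of the flavour of such algorithms (not the authors' derivation), the sketch below runs an exact EM update of the mixing matrix for a small noisy ICA model in which each source prior is a two-component Gaussian scale mixture, a common stand-in for a super-Gaussian density. All names and parameter values are illustrative assumptions; the paper's closed-form variational updates and its Pearson-mixture treatment of sub-Gaussian sources differ in detail.

```python
# Minimal sketch: EM for noisy ICA with Gaussian-scale-mixture source priors.
# Only the mixing matrix A is updated; noise and prior parameters are held fixed.
import numpy as np
from itertools import product

rng = np.random.default_rng(0)

# Toy data: 2 super-Gaussian (Laplace) sources, noisy linear mixing.
n, d = 2000, 2
S = rng.laplace(size=(n, d))
A_true = np.array([[1.0, 0.6],
                   [0.4, 1.0]])
sigma2 = 0.01
X = S @ A_true.T + np.sqrt(sigma2) * rng.standard_normal((n, d))

# Per-source prior: zero-mean mixture of a narrow and a wide Gaussian (heavy tails).
pis = np.array([0.5, 0.5])     # mixture weights
mus = np.array([0.0, 0.0])     # component means
vars_ = np.array([0.5, 4.0])   # component variances
combos = list(product(range(2), repeat=d))   # all label combinations (2^d of them)

A = np.eye(d)                  # initial mixing matrix
for it in range(50):
    Exs = np.zeros((d, d))     # accumulates sum_n x_n E[s_n]^T
    Ess = np.zeros((d, d))     # accumulates sum_n E[s_n s_n^T]
    for x in X:
        log_w, means, covs = [], [], []
        for c in combos:
            mu_c, var_c = mus[list(c)], vars_[list(c)]
            # Marginal likelihood p(x | labels c) under x = A s + Gaussian noise.
            C = A @ np.diag(var_c) @ A.T + sigma2 * np.eye(d)
            diff = x - A @ mu_c
            log_w.append(np.sum(np.log(pis[list(c)]))
                         - 0.5 * (np.log(np.linalg.det(C))
                                  + diff @ np.linalg.solve(C, diff)))
            # Gaussian posterior p(s | x, c): precision and mean.
            Prec = A.T @ A / sigma2 + np.diag(1.0 / var_c)
            Sig = np.linalg.inv(Prec)
            m = Sig @ (A.T @ x / sigma2 + mu_c / var_c)
            means.append(m); covs.append(Sig)
        lw = np.array(log_w)
        w = np.exp(lw - lw.max()); w /= w.sum()          # responsibilities
        Es = sum(wi * mi for wi, mi in zip(w, means))
        Exs += np.outer(x, Es)
        Ess += sum(wi * (Si + np.outer(mi, mi))
                   for wi, mi, Si in zip(w, means, covs))
    A = Exs @ np.linalg.inv(Ess)   # M-step for the mixing matrix

print("estimated mixing matrix (up to permutation/scale):\n", A)
```

The exact E-step above enumerates all label combinations, which is only feasible for a handful of sources; the variational bounds used in the paper avoid that exponential cost.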
Pages: 503-512
Page count: 10