Estimating the number of components in Gaussian mixture models adaptively for medical image

Cited by: 8
Authors
Xie, Cong-Hua [1 ]
Chang, Jin-Yi [1 ]
Liu, Yong-Jun [1 ]
Affiliations
[1] Changshu Inst Technol, Sch Comp Sci & Engn, Suzhou, Jiangsu, Peoples R China
Source
OPTIK | 2013, Vol. 124, No. 23
Keywords
Gaussian mixture models; Model selection; Expectation-maximization; Medical image density estimation; EMPIRICAL CHARACTERISTIC FUNCTION; AUTOMATIC SEGMENTATION; CROSS-VALIDATION; SELECTION;
DOI
10.1016/j.ijleo.2013.05.028
CLC classification number
O43 [Optics]
Subject classification codes
070207; 0803
Abstract
An important but difficult problem when applying Gaussian mixture models (GMM) to medical image analysis is estimating and testing the number of components with a model selection criterion. Many methods are available to estimate k from the likelihood function. However, some of them require the maximum number of components to be known a priori, and they tend to over-fit the data when the log-likelihood function is far larger than the penalty term. We investigate the log-characteristic function of the GMM to estimate the number of components adaptively for medical images. Our method defines the sum of weighted real parts of all log-characteristic functions of the GMM as a new convergent function and model selection criterion. This criterion exploits the stability of that sum once the number of components exceeds the true number of components. The univariate acidity data, simulated 2D datasets and real 2D medical images are used for testing, and the experimental results suggest that our method, which requires no prior knowledge, is better suited to large-sample applications than other typical methods. (C) 2013 Elsevier GmbH. All rights reserved.
Pages: 6216-6221
Number of pages: 6
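The abstract only states the criterion verbally. Below is a minimal Python sketch of the general idea, under stated assumptions: fit GMMs with increasing K by expectation-maximization, evaluate the sum of weighted real parts of the log-characteristic function of each fitted mixture on a frequency grid, and stop when this quantity stabilizes. The frequency grid, the uniform weights, the stopping threshold, and the helper names (gmm_log_cf_criterion, estimate_num_components) are illustrative assumptions rather than the paper's exact construction; only the closed-form characteristic function of a Gaussian mixture, phi(t) = sum_k pi_k * exp(i*t*mu_k - 0.5*t^2*sigma_k^2), is standard.

import numpy as np
from sklearn.mixture import GaussianMixture

def gmm_log_cf_criterion(gmm, t_grid, weights=None):
    # Sum of weighted real parts of log phi(t) for a fitted univariate GMM.
    means = gmm.means_.ravel()        # component means mu_k
    vars_ = gmm.covariances_.ravel()  # component variances sigma_k^2
    pis = gmm.weights_                # mixing proportions pi_k
    t = np.asarray(t_grid, dtype=float)[:, None]
    # Characteristic function of a Gaussian mixture:
    # phi(t) = sum_k pi_k * exp(i*t*mu_k - 0.5*t^2*sigma_k^2)
    phi = (pis * np.exp(1j * t * means - 0.5 * t**2 * vars_)).sum(axis=1)
    if weights is None:
        weights = np.full(len(t_grid), 1.0 / len(t_grid))  # assumed uniform weights
    return float(np.sum(weights * np.real(np.log(phi))))

def estimate_num_components(x, k_max=10, tol=1e-2):
    # Fit GMMs with K = 1..k_max by EM and return the smallest K after which
    # the criterion changes by less than `tol` (assumed stopping rule).
    t_grid = np.linspace(0.1, 2.0, 20)  # assumed frequency grid
    crits = []
    for k in range(1, k_max + 1):
        gmm = GaussianMixture(n_components=k, random_state=0).fit(x.reshape(-1, 1))
        crits.append(gmm_log_cf_criterion(gmm, t_grid))
        if k > 1 and abs(crits[-1] - crits[-2]) < tol:
            return k - 1, crits  # criterion has stabilized
    return k_max, crits

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic 1-D data from a 3-component mixture, standing in for the
    # univariate acidity data mentioned in the abstract.
    x = np.concatenate([rng.normal(0.0, 1.0, 400),
                        rng.normal(5.0, 1.0, 400),
                        rng.normal(10.0, 1.0, 400)])
    k_hat, crits = estimate_num_components(x)
    print("estimated number of components:", k_hat)

The stopping rule mirrors the stability argument in the abstract: once K exceeds the true number of components, adding further components changes the fitted characteristic function, and hence the criterion, very little.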