Automated learning of mixtures of factor analysis models with missing information

Cited by: 0
Authors
Wan-Lun Wang
Tsung-I Lin
Affiliations
[1] Feng Chia University, Department of Statistics, Graduate Institute of Statistics and Actuarial Science
[2] National Chung Hsing University, Institute of Statistics
[3] China Medical University, Department of Public Health
Source
TEST | 2020, Vol. 29
Keywords
Automated learning; Factor analysis; Maximum likelihood estimation; Missing values; Model selection; One-stage algorithm; 62H12; 62H25; 62H30
DOI
Not available
Abstract
The mixture of factor analyzers (MFA) model has emerged as a useful tool to perform dimensionality reduction and model-based clustering for heterogeneous data. In seeking the most appropriate number of factors (q) of an MFA model with the number of components (g) fixed a priori, a two-stage procedure is commonly implemented by first carrying out parameter estimation over a set of prespecified numbers of factors, and then selecting the best q according to certain penalized likelihood criteria. When the dimensionality of the data grows higher, such a procedure can be computationally prohibitive. To overcome this obstacle, we develop an automated learning scheme, called the automated MFA (AMFA) algorithm, to effectively merge parameter estimation and selection of q into a one-stage algorithm. The proposed AMFA procedure, which allows for much lower computational cost, is also extended to accommodate missing values. Moreover, we explicitly derive the score vector and the empirical information matrix for calculating standard errors associated with the estimated parameters. The potential and applicability of the proposed method are demonstrated through a number of real datasets with genuine and synthetic missing values.
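The abstract contrasts AMFA with the usual two-stage approach: fit the model over a grid of candidate numbers of factors q, then pick the value minimizing a penalized likelihood criterion such as BIC. The following is a minimal sketch of that two-stage baseline only, shown for the single-component (g = 1) special case on synthetic complete data using scikit-learn's FactorAnalysis; the mixture setting, the missing-data handling, and the one-stage AMFA scheme itself are not reproduced here, and the simplified parameter count is an assumption for illustration.

```python
# Sketch of the conventional two-stage selection of q via BIC (g = 1 case).
# This is NOT the authors' AMFA algorithm, which merges estimation and
# selection of q into a single pass and accommodates missing values.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)

# Synthetic data: p = 10 variables generated from a 3-factor model.
n, p, q_true = 500, 10, 3
loadings = rng.normal(size=(p, q_true))
factors = rng.normal(size=(n, q_true))
X = factors @ loadings.T + rng.normal(scale=0.5, size=(n, p))

def bic_for_q(X, q):
    """Fit a q-factor model and return its BIC (simplified parameter count)."""
    n, p = X.shape
    fa = FactorAnalysis(n_components=q).fit(X)
    loglik = fa.score(X) * n          # score() returns the mean log-likelihood
    n_params = p * q + 2 * p          # loadings + means + noise variances
    return -2.0 * loglik + n_params * np.log(n)

# Stage 1: estimate the model over a grid of candidate q.
# Stage 2: select the q with the smallest BIC.
candidates = range(1, 7)
bics = {q: bic_for_q(X, q) for q in candidates}
q_best = min(bics, key=bics.get)
print({q: round(b, 1) for q, b in bics.items()})
print("selected q =", q_best)
```

Every candidate q requires a full model fit in this scheme, which is the computational burden the one-stage AMFA procedure is designed to avoid, especially when it is repeated for each component of a g-component mixture and the data dimension is large.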
Pages: 1098–1124
Number of pages: 26
Related papers
50 records in total
[21] Su, Liangjun; Wang, Fa. Inference for large dimensional factor models under general missing data patterns [J]. JOURNAL OF ECONOMETRICS, 2025, 250.
[22] Zhao, Jianhua; Shang, Changchun; Li, Shulan; Xin, Ling; Yu, Philip L. H. Choosing the number of factors in factor analysis with incomplete data via a novel hierarchical Bayesian information criterion [J]. ADVANCES IN DATA ANALYSIS AND CLASSIFICATION, 2025, 19 (01): 209-235.
[23] Alsamadony, Khalid; Ibrahim, Ahmed Farid; Elkatatny, Salaheldin; Abdulraheem, Abdulazeez. Photoelectric factor prediction using automated learning and uncertainty quantification [J]. NEURAL COMPUTING & APPLICATIONS, 2023, 35 (30): 22595-22604.
[25] Alvarez, Rocio; Camacho, Maximo; Perez-Quiros, Gabriel. Aggregate versus disaggregate information in dynamic factor models [J]. INTERNATIONAL JOURNAL OF FORECASTING, 2016, 32 (03): 680-694.
[26] Hemming, Karla; Hutton, Jane Luise. Bayesian sensitivity models for missing covariates in the analysis of survival data [J]. JOURNAL OF EVALUATION IN CLINICAL PRACTICE, 2012, 18 (02): 238-246.
[27] Abas, Ahmed R. Incremental general regression with expectation maximization for learning finite mixtures using data with missing values [J]. WORLD CONGRESS ON COMPUTER & INFORMATION TECHNOLOGY (WCCIT 2013), 2013.
[28] Wang, Wan-Lun; Lin, Tsung-I. Robust model-based clustering via mixtures of skew-t distributions with missing information [J]. ADVANCES IN DATA ANALYSIS AND CLASSIFICATION, 2015, 9 (04): 423-445.
[29] Hua, Van; Huynh, Bao. Forecasting a Journal Impact Factor Under Missing Values Based on Machine Learning [J]. IEEE ACCESS, 2024, 12: 85745-85760.
[30] Lin, Ting Hsiang. Model selection information criteria in latent class models with missing data and contingency question [J]. JOURNAL OF STATISTICAL COMPUTATION AND SIMULATION, 2014, 84 (01): 159-170.