New estimation and feature selection methods in mixture-of-experts models

Cited by: 37
Authors
Khalili, Abbas [1 ]
Affiliations
[1] McGill Univ, Dept Math & Stat, Montreal, PQ H3A 2K6, Canada
Source
CANADIAN JOURNAL OF STATISTICS-REVUE CANADIENNE DE STATISTIQUE | 2010 / Vol. 38 / No. 4
Funding
Natural Sciences and Engineering Research Council of Canada;
Keywords
Mixture-of-experts; regularization; LASSO; SCAD; EM algorithms; NONCONCAVE PENALIZED LIKELIHOOD; VARIABLE SELECTION; HIERARCHICAL MIXTURES; LOGISTIC-REGRESSION; EM ALGORITHM; MAXIMUM-LIKELIHOOD; BAYESIAN-INFERENCE; IDENTIFIABILITY; ARCHITECTURES;
DOI
10.1002/cjs.10083
Chinese Library Classification
O21 [Probability theory and mathematical statistics]; C8 [Statistics];
Discipline codes
020208; 070103; 0714;
Abstract
We study estimation and feature selection problems in mixture-of-experts models. An l2-penalized maximum likelihood estimator is proposed as an alternative to the ordinary maximum likelihood estimator. The estimator is particularly advantageous when fitting a mixture-of-experts model to data with many correlated features. It is shown that the proposed estimator is root-n consistent, and simulations show its superior finite-sample behaviour compared to that of the maximum likelihood estimator. For feature selection, two extra penalty functions are applied to the l2-penalized log-likelihood function. The proposed feature selection method is computationally much more efficient than the popular all-subset selection methods. Theoretically, it is shown that the method is consistent in feature selection, and simulations support our theoretical results. A real-data example is presented to demonstrate the method. The Canadian Journal of Statistics 38: 519-539; 2010. (C) 2010 Statistical Society of Canada
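To make the penalized-likelihood idea in the abstract concrete, here is a minimal sketch under common assumptions (Gaussian experts with softmax gating); the symbols below (K experts, gating weights w_k, expert coefficients beta_k, variances sigma_k^2, tuning constants lambda_n, gamma_n) are illustrative and may differ from the paper's own notation. The mixture-of-experts density is

    f(y \mid x; \theta) = \sum_{k=1}^{K} \pi_k(x; w)\, \phi\big(y;\ x^{\top}\beta_k,\ \sigma_k^2\big),
    \qquad
    \pi_k(x; w) = \frac{\exp(w_k^{\top} x)}{\sum_{j=1}^{K} \exp(w_j^{\top} x)},

and an l2-penalized log-likelihood of the kind described in the abstract would take the form

    \tilde{\ell}_n(\theta) = \sum_{i=1}^{n} \log f(y_i \mid x_i; \theta)
      \;-\; \lambda_n \sum_{k=1}^{K} \big( \lVert w_k \rVert_2^2 + \lVert \beta_k \rVert_2^2 \big).

For feature selection, component-wise penalties such as LASSO or SCAD are applied to the individual coefficients, e.g. subtracting \sum_{k}\sum_{j} p_{\gamma_n}(|\beta_{kj}|) from the objective, which is then maximized with an EM-type algorithm.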
Pages: 519-539
Page count: 21