Ensemble Feature Selection With Block-Regularized m x 2 Cross-Validation

Cited by: 2
Authors
Yang, Xingli [1 ]
Wang, Yu [2 ]
Wang, Ruibo [2 ]
Li, Jihong [2 ]
Affiliations
[1] Shanxi Univ, Sch Math Sci, Taiyuan 030006, Peoples R China
[2] Shanxi Univ, Sch Modern Educ Technol, Taiyuan 030006, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Feature extraction; Correlation; Indexes; Data models; Technological innovation; Reliability theory; Upper bound; Beta distribution; block-regularized m x 2 cross-validation; ensemble feature selection (EFS); false positive; true positive; VARIABLE SELECTION; REGRESSION; PRECISION; RECALL;
DOI
10.1109/TNNLS.2021.3128173
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Ensemble feature selection (EFS) has attracted significant interest in the literature due to its great potential for reducing the discovery rate of noise features and stabilizing feature selection results. In view of the superior performance of block-regularized m x 2 cross-validation in estimating generalization performance and comparing algorithms, a novel EFS technique based on block-regularized m x 2 cross-validation is proposed in this study. In contrast to traditional ensemble learning, where the feature selection frequency follows a binomial distribution, the selection frequency in the proposed technique is more accurately approximated by a beta distribution. Furthermore, theoretical analysis shows that the proposed technique yields a higher selection probability for important features, a lower risk of selecting noise features, more true positives, and fewer false positives. Finally, these conclusions are verified in experiments on simulated and real data.
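To make the ensemble procedure concrete, the sketch below shows a plain m x 2 cross-validated ensemble feature selector in Python. It is a minimal illustration only: the base selector (Lasso), the 0.5 frequency cutoff, and the function name mx2cv_ensemble_selection are assumptions, and the paper's block-regularization constraints on the m partitions and its beta-distribution-based analysis are not reproduced here.

```python
# Illustrative sketch: plain m x 2 cross-validation ensemble feature selection.
# NOTE: the block-regularization of the m partitions and the beta-distribution
# threshold from the paper are NOT implemented; Lasso and the 0.5 cutoff are
# illustrative assumptions.
import numpy as np
from sklearn.linear_model import Lasso

def mx2cv_ensemble_selection(X, y, m=3, alpha=0.1, freq_threshold=0.5, seed=0):
    n_features = X.shape[1]
    counts = np.zeros(n_features)
    rng = np.random.RandomState(seed)
    for rep in range(m):
        # One replication of 2-fold CV: each half serves once as training data.
        idx = rng.permutation(len(y))
        half = len(y) // 2
        for train_idx in (idx[:half], idx[half:]):
            model = Lasso(alpha=alpha).fit(X[train_idx], y[train_idx])
            counts += (np.abs(model.coef_) > 1e-8)  # which features survived
    # Selection frequency of each feature over the 2m training sets.
    freq = counts / (2 * m)
    selected = np.where(freq >= freq_threshold)[0]
    return selected, freq

# Usage: selected, freq = mx2cv_ensemble_selection(X, y, m=3)
```

Features whose selection frequency across the 2m folds exceeds the cutoff are retained; the paper's contribution lies in how that frequency is modeled (beta rather than binomial) and how the partitions are regularized, which this sketch does not capture.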
Pages: 6628-6641
Number of pages: 14