An efficient method for feature selection in linear regression based on an extended Akaike's information criterion

Cited by: 1
Authors
Vetrov, D. P. [1 ]
Kropotov, D. A. [2 ]
Ptashko, N. O. [1 ]
Affiliations
[1] Moscow MV Lomonosov State Univ, Fac Computat Math & Cybernet, Moscow 119992, Russia
[2] Russian Acad Sci, Dorodnicyn Comp Ctr, Moscow 119333, Russia
Funding
Russian Foundation for Basic Research;
Keywords
pattern recognition; linear regression; feature selection; Akaike's information criterion;
DOI
10.1134/S096554250911013X
CLC classification
O29 [Applied Mathematics];
Discipline code
070104 ;
Abstract
A method for feature selection in linear regression based on an extension of Akaike's information criterion is proposed. Using the classical Akaike information criterion (AIC) for feature selection requires an exhaustive search over all subsets of features, which is computationally prohibitive. A new information criterion is proposed that is a continuous extension of AIC; as a result, the feature selection problem reduces to a smooth optimization problem, and an efficient procedure for solving it is derived. Experiments show that the proposed method selects features in linear regression efficiently. In the experiments, the proposed procedure is compared with the relevance vector machine, a feature selection method based on the Bayesian approach, and the two are shown to yield similar results. The main distinction of the proposed method is that certain regularization coefficients are exactly zero, which makes it possible to avoid the underfitting effect characteristic of the relevance vector machine. A special case (the so-called nondiagonal regularization) is also considered in which the two methods coincide.
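To make the complexity argument concrete, here is a minimal sketch (not the paper's method, whose continuous extension of AIC is not reproduced here) of classical AIC-based feature selection by exhaustive subset search; for d features it fits 2^d - 1 least-squares models, which is exactly the cost the proposed criterion avoids. All function names are illustrative.

```python
# Illustrative sketch: classical AIC feature selection for linear
# regression via exhaustive subset search. For d features this fits
# 2^d - 1 models, which becomes prohibitive as d grows.
from itertools import combinations
import numpy as np

def aic_linear(X, y):
    """AIC for a least-squares fit: n*ln(RSS/n) + 2k, k = #features used."""
    n, k = X.shape
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(np.sum((y - X @ w) ** 2))
    return n * np.log(rss / n) + 2 * k

def exhaustive_aic_selection(X, y):
    """Evaluate every non-empty feature subset; return the AIC-minimizing one."""
    d = X.shape[1]
    best_subset, best_aic = None, np.inf
    for r in range(1, d + 1):
        for subset in combinations(range(d), r):
            aic = aic_linear(X[:, subset], y)
            if aic < best_aic:
                best_subset, best_aic = subset, aic
    return best_subset, best_aic

# Toy data: the target depends only on features 0 and 2.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))
y = 3.0 * X[:, 0] - 2.0 * X[:, 2] + 0.1 * rng.normal(size=100)
subset, aic = exhaustive_aic_selection(X, y)
```

The 2k penalty term is what discourages the irrelevant features: adding one lowers the residual sum of squares only marginally while costing a fixed penalty of 2.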
Pages: 1972-1985
Page count: 14
Related Papers
50 records in total
  • [1] An efficient method for feature selection in linear regression based on an extended Akaike’s information criterion
    D. P. Vetrov
    D. A. Kropotov
    N. O. Ptashko
    Computational Mathematics and Mathematical Physics, 2009, 49 : 1972 - 1985
  • [2] A new segmentation method of electroencephalograms by use of Akaike's information criterion
    Inouye, T
    Toi, S
    Matsumoto, Y
    COGNITIVE BRAIN RESEARCH, 1995, 3 (01): : 33 - 40
  • [3] An Effective Feature Selection Method Using Dynamic Information Criterion
    Liu, Huawen
    Li, Minshuo
    Zhao, Jianmin
    Mo, Yuchang
    ARTIFICIAL INTELLIGENCE AND COMPUTATIONAL INTELLIGENCE, PT I, 2011, 7002 : 450 - 455
  • [4] Linear regression-based feature selection for microarray data classification
    Hasan, Md Abid
    Hasan, Md Kamrul
    Mottalib, M. Abdul
    INTERNATIONAL JOURNAL OF DATA MINING AND BIOINFORMATICS, 2015, 11 (02) : 167 - 179
  • [5] Akaike's Information Criterion for Stoichiometry Inference of Supramolecular Complexes
    Ikemoto, Koki
    Takahashi, Kanato
    Ozawa, Takeaki
    Isobe, Hiroyuki
    ANGEWANDTE CHEMIE-INTERNATIONAL EDITION, 2023, 62 (14)
  • [6] SELECTION OF A MATHEMATICAL MODEL FOR THE KINETICS OF Haemophilus influenzae TYPE B USING AKAIKE'S INFORMATION CRITERION
    Cintra, F. de O.
    Takagi, M.
    BRAZILIAN JOURNAL OF CHEMICAL ENGINEERING, 2018, 35 (04) : 1305 - 1314
  • [7] On the Feature Selection Criterion Based on an Approximation of Multidimensional Mutual Information
    Balagani, Kiran S.
    Phoha, Vir V.
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2010, 32 (07) : 1342 - 1343
  • [8] Estimation of leaf area index of winter wheat based on akaike's information criterion
    Yang, Fuqin
    Feng, Haikuan
    Li, Zhenhai
    Jin, Xiuliang
    Yang, Guijun
    Dai, Huayang
    Nongye Jixie Xuebao/Transactions of the Chinese Society for Agricultural Machinery, 2015, 46 (11): : 112 - 120, 164
  • [9] Estimation of Leaf Nitrogen Content of Winter Wheat Based on Akaike's Information Criterion
    Pei, Haojie
    Feng, Haikuan
    Yang, Fuqin
    Li, Zhenhai
    Yang, Guijun
    Niu, Qinglin
    COMPUTER AND COMPUTING TECHNOLOGIES IN AGRICULTURE XI, CCTA 2017, PT II, 2019, 546 : 231 - 240
  • [10] Carousel Greedy Algorithms for Feature Selection in Linear Regression
    Wang, Jiaqi
    Golden, Bruce
    Cerrone, Carmine
    ALGORITHMS, 2023, 16 (09)