A family of the information criteria using the phi-divergence for categorical data

Cited by: 1
Author
Ogasawara, Haruhiko [1 ]
Affiliation
[1] Otaru Univ, 3-5-21 Midori, Otaru, Hokkaido 0478501, Japan
Keywords
Power divergence; Risk; Model selection; Asymptotic bias; Akaike information criterion; Loglinear models; Cross-validation; Selection; Estimators; Square; Choice
DOI
10.1016/j.csda.2018.03.001
CLC number (Chinese Library Classification)
TP39 [Computer applications]
Subject classification codes
081203; 0835
Abstract
The risk of the phi-divergence of a statistical model for categorical data is defined using two independent sets of data. The asymptotic bias of the phi-divergence based on the current data, regarded as an estimator of this risk, is shown to equal the negative penalty term of the Akaike information criterion (AIC). The higher-order asymptotic bias is also derived; it depends on the form of the phi-divergence and on the method used to estimate the parameters, which may employ a possibly different form of the phi-divergence. An approximation to the higher-order bias is obtained based on the simple result for the saturated model. The information criteria using this approximation yield improved results in simulations for model selection, and some cases of the phi-divergences show advantages over the AIC. (C) 2018 Elsevier B.V. All rights reserved.
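For context, a minimal sketch of the divergence family the abstract and keywords refer to, using the standard Csiszar phi-divergence and Cressie-Read power-divergence definitions for categorical data; these are textbook forms given for orientation only, not the paper's specific risk, bias, or criterion expressions.

% Phi-divergence between observed cell proportions p_i and model cell
% probabilities pi_i, for a convex function phi with phi(1) = 0
% (standard definition, assumed here for illustration):
D_{\phi}(p, \pi) = \sum_{i} \pi_i \, \phi\!\left( \frac{p_i}{\pi_i} \right).

% Cressie-Read power-divergence subfamily, indexed by lambda;
% the limit lambda -> 0 recovers the Kullback-Leibler divergence
% underlying the AIC:
I^{\lambda}(p : \pi) = \frac{1}{\lambda(\lambda + 1)} \sum_{i} p_i
  \left[ \left( \frac{p_i}{\pi_i} \right)^{\lambda} - 1 \right].

On this scale, an AIC-type criterion replaces the log-likelihood-based divergence with a chosen member of the family and corrects it for the estimated bias; the abstract states that, to first order, this bias correction coincides with the AIC penalty term.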
Pages: 87-103
Number of pages: 17
Related papers
50 records in total
  • [21] Ultra-high-dimensional feature screening of binary categorical response data based on Jensen-Shannon divergence
    Jiang, Qingqing
    Deng, Guangming
    AIMS MATHEMATICS, 2024, 9 (02): 2874-2907
  • [22] Using Experimental Data and Information Criteria to Guide Model Selection for Reaction–Diffusion Problems in Mathematical Biology
    David J. Warne
    Ruth E. Baker
    Matthew J. Simpson
    Bulletin of Mathematical Biology, 2019, 81: 1760-1804
  • [23] Normalized Information Criteria and Model Selection in the Presence of Missing Data
    Cohen, Nitzan
    Berchenko, Yakir
    MATHEMATICS, 2021, 9 (19)
  • [24] Model selection in multivariate adaptive regressions splines (MARS) using alternative information criteria
    Adiguzel, Meryem Bekar
    Cengiz, Mehmet Ali
    HELIYON, 2023, 9 (09)
  • [25] QUANTIFYING INFORMATION FLOW IN FMRI USING THE KULLBACK-LEIBLER DIVERGENCE
    Seghouane, Abd-Krim
    2011 8TH IEEE INTERNATIONAL SYMPOSIUM ON BIOMEDICAL IMAGING: FROM NANO TO MACRO, 2011: 1569-1572
  • [26] Mutual Information Scoring: Increasing Interpretability in Categorical Clustering Tasks with Applications to Child Welfare Data
    Sankhe, Pranav
    Hall, Seventy F.
    Sage, Melanie
    Rodriguez, Maria Y.
    Chandola, Varun
    Joseph, Kenneth
    SOCIAL, CULTURAL, AND BEHAVIORAL MODELING (SBP-BRIMS 2022), 2022, 13558: 165-175
  • [27] Error Statistics Using the Akaike and Bayesian Information Criteria
    Cheng, Henrique
    Sterner, Beckett
    ERKENNTNIS, 2024,
  • [28] Model selection using information criteria and genetic algorithms
    Balcombe K.G.
    Computational Economics, 2005, 25 (3): 207-228
  • [29] MiCS-P: Parallel mutual-information computation of big categorical data on Spark
    Li, Junli
    Zhang, Chaowei
    Zhang, Jifu
    Qin, Xiao
    Hu, Lihua
    JOURNAL OF PARALLEL AND DISTRIBUTED COMPUTING, 2022, 161: 118-129
  • [30] Using Experimental Data and Information Criteria to Guide Model Selection for Reaction-Diffusion Problems in Mathematical Biology
    Warne, David J.
    Baker, Ruth E.
    Simpson, Matthew J.
    BULLETIN OF MATHEMATICAL BIOLOGY, 2019, 81 (06): 1760-1804