A note on the comparison of polynomial selection methods

Cited by: 0
Authors
Viswanathan, M [1]
Wallace, C [1]
Affiliations
[1] Monash Univ, Sch Comp Sci & Software Engn, Clayton, Vic 3168, Australia
Source
ARTIFICIAL INTELLIGENCE AND STATISTICS 99, PROCEEDINGS | 1999
Keywords
polynomial approximation; regression; polynomial model selection; minimum message length; structural risk minimization; VC-dimension
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Minimum Message Length (MML) [2] and Structural Risk Minimisation (SRM) [5] are two computational learning principles that have achieved wide acclaim in recent years. Although the former is based on Bayesian learning and the latter on the classical theory of the VC-dimension, both attempt to define a trade-off between model complexity and goodness of fit to the data. A recent empirical study by Wallace [1] compared the performance of standard model selection methods in a one-dimensional polynomial regression framework; its results provided strong evidence for the MML- and SRM-based methods over the other standard approaches. In this paper we present a detailed empirical evaluation of three model selection methods: an MML-based approach and two SRM-based methods. Results from our analysis and experimental evaluation suggest that the MML-based approach generally has higher predictive accuracy, and they also raise questions about the inductive capabilities of the Structural Risk Minimisation principle.
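As a concrete illustration of the complexity/fit trade-off described in the abstract, the Python sketch below selects a polynomial order for one-dimensional regression data by minimising a complexity-penalised fit score. It is only an illustration under assumed choices: the synthetic sine target, the sample size, and the BIC-style penalty are stand-ins, not the MML87 or SRM criteria actually evaluated in the paper.

```python
import numpy as np

# Illustrative sketch of complexity-penalised polynomial order selection.
# The penalty here is a generic BIC-style term, NOT the exact MML or SRM
# criteria compared in the paper; it only shows the complexity/fit
# trade-off that both principles formalise.

rng = np.random.default_rng(0)

# Synthetic one-dimensional regression data (hypothetical target function).
n = 50
x = rng.uniform(-1.0, 1.0, size=n)
y = np.sin(2.0 * np.pi * x) + rng.normal(scale=0.2, size=n)

def penalised_score(x, y, degree):
    """Residual fit plus a complexity penalty for a degree-`degree` polynomial."""
    coeffs = np.polyfit(x, y, degree)        # least-squares polynomial fit
    residuals = y - np.polyval(coeffs, x)
    rss = float(residuals @ residuals)       # residual sum of squares
    k = degree + 1                           # number of free coefficients
    # BIC-style penalty: n*log(RSS/n) + k*log(n); a stand-in for the
    # message-length or guaranteed-risk terms of MML / SRM.
    return len(x) * np.log(rss / len(x)) + k * np.log(len(x))

scores = {d: penalised_score(x, y, d) for d in range(1, 11)}
best_degree = min(scores, key=scores.get)
print(f"selected polynomial degree: {best_degree}")
```

On data like this, low orders leave a large residual term while high orders are charged a growing complexity penalty, so the selected degree sits at the kind of compromise that the MML and SRM criteria formalise more rigorously.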
Pages: 169-177
Page count: 9
Related papers
50 records in total
  • [1] Minimum message length and classical methods for model selection in univariate polynomial regression
    Viswanathan, M
    Yang, Y
    Whangbo, TK
    ETRI JOURNAL, 2005, 27 (06) : 747 - 758
  • [2] An experimental and theoretical comparison of model selection methods
    Kearns, M
    Mansour, Y
    Ng, AY
    Ron, D
    MACHINE LEARNING, 1997, 27 (01) : 7 - 50
  • [3] Comparison of Bayesian predictive methods for model selection
    Piironen, Juho
    Vehtari, Aki
    STATISTICS AND COMPUTING, 2017, 27 (03) : 711 - 735
  • [4] Comparison of variable selection methods for clinical predictive modeling
    Sanchez-Pinto, L. Nelson
    Venable, Laura Ruth
    Fahrenbach, John
    Churpek, Matthew M.
    INTERNATIONAL JOURNAL OF MEDICAL INFORMATICS, 2018, 116 : 10 - 17
  • [5] A note on polynomial approximation in Sobolev spaces
    Verfürth, R
    RAIRO-MATHEMATICAL MODELLING AND NUMERICAL ANALYSIS-MODELISATION MATHEMATIQUE ET ANALYSE NUMERIQUE, 1999, 33 (04) : 715 - 719
  • [6] Feature selection for pattern recognition by LASSO and thresholding methods - a comparison
    Libal, Urszula
    2011 16TH INTERNATIONAL CONFERENCE ON METHODS AND MODELS IN AUTOMATION AND ROBOTICS, 2011, : 168 - 173
  • [7] Comparison of Feature Selection Methods - Modelling COPD Outcomes
    Cabral, Jorge
    Macedo, Pedro
    Marques, Alda
    Afreixo, Vera
    MATHEMATICS, 2024, 12 (09)
  • [8] Comparison of shrinkage and feature selection methods for modeling GDP of Pakistan
    Dar, Irum Sajjad
    Afzal, Muhammad
    Khalil, Sadia
    Shamim, Maira
    JOURNAL OF STATISTICS AND MANAGEMENT SYSTEMS, 2022, 25 (04) : 957 - 969
  • [9] A global selection procedure for polynomial interpolators
    Bates, RA
    Giglio, B
    Wynn, HP
    TECHNOMETRICS, 2003, 45 (03) : 246 - 255