Generalization ability of fractional polynomial models

Cited by: 1
Authors
Lei, Yunwen [1 ]
Ding, Lixin [1 ]
Ding, Yiming [2 ]
Affiliations
[1] Wuhan Univ, Sch Comp, State Key Lab Software Engn, Wuhan 430072, Peoples R China
[2] Chinese Acad Sci, Wuhan Inst Phys & Math, Wuhan 430071, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Learning algorithm; Learning theory; Fractional polynomial; Model selection; Approximation theory; NONLINEAR LEAST-SQUARES; COVERING NUMBER; NEURAL-NETWORKS; VC-DIMENSION; SELECTION; BOUNDS; CLASSIFICATION; APPROXIMATION; REGRESSION; COMPLEXITY;
DOI
10.1016/j.neunet.2013.09.009
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In this paper, the problem of learning the functional dependency between input and output variables from scattered data using fractional polynomial models (FPM) is investigated. Estimation error bounds are obtained by calculating the pseudo-dimension of FPM, which is shown to be equal to that of sparse polynomial models (SPM). A linear decay of the approximation error is obtained for a class of target functions that is dense in the space of continuous functions. We derive a structural risk analogous to the Schwartz Criterion and demonstrate theoretically that the model minimizing this structural risk achieves a favorable balance between estimation and approximation errors. An empirical model selection comparison is also performed to justify the use of this structural risk in selecting the optimal complexity index from the data. We show that the construction of FPM can be addressed efficiently by the variable projection method. Furthermore, our empirical study indicates that FPM can attain better generalization performance than SPM and cubic splines. (C) 2013 Elsevier Ltd. All rights reserved.
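To make the construction idea concrete, the sketch below illustrates (not the authors' code) how a fractional polynomial model f(x) = sum_j c_j * x^{r_j} can be fitted by variable projection, as mentioned in the abstract: for fixed fractional exponents r the coefficients c solve a linear least-squares problem, so only the exponents are optimized nonlinearly. The helper names (design_matrix, projected_residual, fit_fpm), the choice of optimizer, the complexity index k, and the exponent range are illustrative assumptions.

```python
# Minimal sketch of fractional polynomial fitting via variable projection.
# Assumes positive inputs x (required for non-integer powers) and a fixed
# number of terms k (the "complexity index"); these choices are illustrative.
import numpy as np
from scipy.optimize import minimize

def design_matrix(x, exponents):
    """Columns x**r_j, one per fractional exponent r_j."""
    return np.column_stack([x ** r for r in exponents])

def projected_residual(exponents, x, y):
    """Residual norm after eliminating the linear coefficients for fixed exponents."""
    A = design_matrix(x, exponents)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return np.linalg.norm(A @ coef - y)

def fit_fpm(x, y, k=3, seed=0):
    """Fit a k-term fractional polynomial by optimizing only the exponents."""
    rng = np.random.default_rng(seed)
    r0 = rng.uniform(0.1, 3.0, size=k)          # initial fractional exponents
    res = minimize(projected_residual, r0, args=(x, y), method="Nelder-Mead")
    A = design_matrix(x, res.x)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)  # recover linear coefficients
    return res.x, coef

# Toy usage on synthetic data with fractional powers 0.5 and 1.8.
x = np.linspace(0.1, 2.0, 200)
y = 1.5 * x**0.5 - 0.7 * x**1.8 + 0.05 * np.random.default_rng(1).normal(size=x.size)
exponents, coef = fit_fpm(x, y, k=2)
print("exponents:", exponents, "coefficients:", coef)
```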
Pages: 59-73
Number of pages: 15