An Empirical Overview of the No Free Lunch Theorem and Its Effect on Real-World Machine Learning Classification

Cited by: 80
Authors
Gomez, David [1]
Rojas, Alfonso [1]
Affiliations
[1] Univ Politecn Cataluna, Telemat Engn Dept, Barcelona 08034, Spain
Keywords
A-priori distinctions
DOI
10.1162/NECO_a_00793
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
A sizable amount of research has been done to improve knowledge-extraction mechanisms such as machine learning classification and regression. Quite unintuitively, the no free lunch (NFL) theorem states that all optimization strategies perform equally well when averaged over all possible problems. This fact seems to clash with the effort put into developing better algorithms. This letter empirically explores the effect of the NFL theorem on some popular machine learning classification techniques over real-world data sets.
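The NFL result referenced in the abstract is usually stated in the Wolpert-Macready form: for any two optimization algorithms a_1 and a_2, performance summed over all possible objective functions f is identical,

    \sum_{f} P\left(d_m^{y} \mid f, m, a_1\right) = \sum_{f} P\left(d_m^{y} \mid f, m, a_2\right),

where d_m^y denotes the sequence of cost values observed after m distinct evaluations. This is the standard formulation from the broader literature, not an equation taken from this letter.

A minimal sketch of the kind of empirical comparison the abstract describes, assuming scikit-learn and its bundled real-world data sets; the particular classifiers, data sets, and settings below are illustrative assumptions, not the authors' actual experimental protocol.

    # Illustrative sketch: cross-validated accuracy of a few popular
    # classifiers on small real-world data sets. The NFL theorem concerns
    # averages over *all* possible problems; on this restricted set of
    # real-world problems, accuracies are expected to differ.
    from sklearn.datasets import load_breast_cancer, load_iris, load_wine
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.svm import SVC

    datasets = {
        "iris": load_iris(),
        "wine": load_wine(),
        "breast_cancer": load_breast_cancer(),
    }
    classifiers = {
        "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
        "svm_rbf": SVC(kernel="rbf", gamma="scale"),
        "knn_5": KNeighborsClassifier(n_neighbors=5),
        "logistic_regression": LogisticRegression(max_iter=5000),
    }

    for ds_name, ds in datasets.items():
        for clf_name, clf in classifiers.items():
            # 10-fold cross-validation; default scoring for classifiers is accuracy.
            scores = cross_val_score(clf, ds.data, ds.target, cv=10)
            print(f"{ds_name:15s} {clf_name:20s} mean accuracy = {scores.mean():.3f}")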
Pages: 216-228
Page count: 13