Predicting Slaughter Weight in Pigs with Regression Tree Ensembles

Cited by: 9
Authors
Alsahaf, A. [1 ]
Azzopardi, G. [1 ]
Ducro, B. [2 ]
Veerkamp, R. F. [2 ]
Petkov, N. [1 ]
Affiliations
[1] Univ Groningen, Johann Bernoulli Inst Math & Comp Sci, Groningen, Netherlands
[2] Wageningen Univ & Res, Wageningen, Netherlands
Source
APPLICATIONS OF INTELLIGENT SYSTEMS | 2018, Vol. 310
Keywords
random forest; XGBoost; ensemble learning; gradient boosting; pigs; animal production;
DOI
10.3233/978-1-61499-929-4-1
CLC classification number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Domestic pigs vary in the age at which they reach slaughter weight, even under the controlled conditions of modern pig farming. Early and accurate estimates of when a pig will reach slaughter weight can improve logistical efficiency on farms. In this study, we compare four regression tree-based ensemble methods for predicting the age at which a pig reaches slaughter weight (120 kg): random forest (RF), extremely randomized trees (ET), gradient boosting machines (GBM), and XGBoost. Data from 32,979 pigs are used, comprising a combination of phenotypic features and estimated breeding values (EBVs). We find that the boosting methods, GBM and XGBoost, achieve lower prediction errors than the parallel ensemble methods, RF and ET. On the other hand, RF and ET have fewer parameters to tune and perform adequately with default parameter settings.
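The comparison described in the abstract can be sketched with off-the-shelf implementations. The snippet below is a minimal illustration, not the authors' code: the pig dataset is not reproduced here, so synthetic regression data stand in for the phenotypic features and EBVs, and the hyperparameters are illustrative defaults. Three of the four methods ship with scikit-learn; XGBoost would slot into the same loop via `xgboost.XGBRegressor`, omitted here to keep the sketch dependency-free.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import (RandomForestRegressor, ExtraTreesRegressor,
                              GradientBoostingRegressor)
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the pig data: 20 numeric features, one
# continuous target (think "age at 120 kg" in days).
X, y = make_regression(n_samples=1000, n_features=20, noise=10.0,
                       random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          random_state=0)

# RF and ET are parallel (bagging-style) ensembles; GBM is a
# sequential boosting ensemble, as contrasted in the abstract.
models = {
    "RF": RandomForestRegressor(n_estimators=100, random_state=0),
    "ET": ExtraTreesRegressor(n_estimators=100, random_state=0),
    "GBM": GradientBoostingRegressor(n_estimators=100, random_state=0),
}

errors = {}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    errors[name] = mean_absolute_error(y_te, model.predict(X_te))
    print(f"{name}: test MAE = {errors[name]:.2f}")
```

On real data the relative ranking would depend on tuning, which is the paper's point: the boosting methods won after tuning, while RF and ET were competitive out of the box.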
Pages: 1-9 (9 pages)
References (13 items)
[1] Apichottanakul A., Pathumnakul S., Piewthongngam K. The role of pig size prediction in supply chain planning. Biosystems Engineering, 2012, 113(3): 298-307.
[2] Boland M. A., Foster K. A., Preckel P. V., Schinckel A. P. Analyzing pork carcass evaluation technologies in a swine bioeconomic model. Journal of Production Agriculture, 1996, 9(1): 45-49.
[3] Breiman L. Random forests. Machine Learning, 2001, 45(1): 5-32.
[4] Breiman L. Bagging predictors. Machine Learning, 1996, 24(2): 123-140.
[5] Chen T., Guestrin C. XGBoost: a scalable tree boosting system. KDD'16: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2016: 785-794.
[6] Dorogush A. CatBoost: gradient boosting with categorical features support.
[7] Freund Y., Schapire R. E. A decision-theoretic generalization of on-line learning and an application to boosting. Journal of Computer and System Sciences, 1997, 55(1): 119-139.
[8] Friedman J. H. Greedy function approximation: a gradient boosting machine. Annals of Statistics, 2001, 29(5): 1189-1232.
[9] Friedman J. H. Stochastic gradient boosting. Computational Statistics & Data Analysis, 2002, 38(4): 367-378.
[10] Geurts P., Ernst D., Wehenkel L. Extremely randomized trees. Machine Learning, 2006, 63(1): 3-42.