Robust boosting neural networks with random weights for multivariate calibration of complex samples

Times Cited: 14
Authors
Bian, Xihui [1 ,2 ]
Diwu, Pengyao [1 ,2 ]
Zhang, Caixia [1 ,2 ]
Lin, Ligang [1 ]
Chen, Guohui [2 ]
Tan, Xiaoyao [1 ,2 ]
Guo, Yugao [2 ]
Cheng, Bowen [1 ]
Affiliations
[1] Tianjin Polytech Univ, State Key Lab Separat Membranes & Membrane Proc, Tianjin 300387, Peoples R China
[2] Tianjin Polytech Univ, Sch Environm & Chem Engn, Tianjin 300387, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Ensemble modeling; Boosting; Neural networks with random weights; Extreme learning machine; Outlier; Complex samples; EXTREME LEARNING-MACHINE; PARTIAL LEAST-SQUARES; NEAR-INFRARED SPECTROSCOPY; SPECTRAL QUANTITATIVE-ANALYSIS; SUPPORT VECTOR REGRESSION; OIL SAMPLES; IMPROVEMENT; FUEL; QSAR; PLS;
DOI
10.1016/j.aca.2018.01.013
CLC classification
O65 [Analytical Chemistry];
Subject classification codes
070302; 081704;
Abstract
Neural networks with random weights (NNRW) have been used for regression owing to their excellent performance. However, NNRW is sensitive to outliers and somewhat unstable when dealing with real-world complex samples. To overcome these drawbacks, a new method called robust boosting NNRW (RBNNRW) is proposed by integrating a robust version of boosting with NNRW. The method builds a large number of NNRW sub-models sequentially by robustly reweighted sampling from the original training set and then aggregates their predictions by a weighted median. The performance of RBNNRW is tested on three spectral datasets of wheat, light gas oil and diesel fuel samples. For comparison, the conventional PLS, NNRW and boosting NNRW (BNNRW) methods have also been investigated. The results demonstrate that the introduction of robust boosting greatly enhances the stability and accuracy of NNRW. Moreover, RBNNRW is superior to BNNRW, particularly when outliers exist. (c) 2018 Elsevier B.V. All rights reserved.
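The abstract gives only a high-level description of RBNNRW; the exact robust reweighting rule, the sub-model loss and the hyper-parameters are not specified there. The NumPy sketch below is therefore an illustrative reading of that description, not the published algorithm: it builds ELM-style NNRW sub-models with a sigmoid hidden layer, runs an AdaBoost.R2-style resampling loop in which residuals are clipped at a MAD-based threshold (an assumption) before the sampling probabilities are updated, and aggregates the sub-model predictions by a weighted median. All function names (train_nnrw, rbnnrw_fit, rbnnrw_predict) and parameter values are hypothetical.

```python
import numpy as np


def train_nnrw(X, y, n_hidden, rng):
    """Fit one NNRW (ELM-style) sub-model: random hidden layer, least-squares output weights."""
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights, never trained
    b = rng.normal(size=n_hidden)                 # random hidden biases, never trained
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))        # sigmoid hidden-layer outputs
    beta = np.linalg.pinv(H) @ y                  # output weights by Moore-Penrose pseudo-inverse
    return W, b, beta


def predict_nnrw(model, X):
    W, b, beta = model
    return (1.0 / (1.0 + np.exp(-(X @ W + b)))) @ beta


def weighted_median(values, weights):
    order = np.argsort(values)
    cum = np.cumsum(weights[order])
    return values[order][np.searchsorted(cum, 0.5 * cum[-1])]


def rbnnrw_fit(X, y, n_models=100, n_hidden=30, seed=0):
    """Sequentially build NNRW sub-models on robustly reweighted bootstrap samples."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    p = np.full(n, 1.0 / n)                       # sampling probabilities over training objects
    models, model_weights = [], []
    for _ in range(n_models):
        idx = rng.choice(n, size=n, replace=True, p=p)
        model = train_nnrw(X[idx], y[idx], n_hidden, rng)
        resid = np.abs(y - predict_nnrw(model, X))
        # Robust step (assumed): clip residuals at a MAD-based threshold so a few
        # outlying samples cannot dominate the next round's sampling probabilities.
        med = np.median(resid)
        mad = np.median(np.abs(resid - med)) + 1e-12
        loss = np.minimum(resid, med + 3.0 * mad)
        loss /= loss.max() + 1e-12                # normalised loss in [0, 1]
        avg_loss = float(np.clip(np.sum(p * loss), 1e-6, 0.499))
        beta_m = avg_loss / (1.0 - avg_loss)      # confidence factor of this sub-model
        p = p * beta_m ** (1.0 - loss)            # easy samples down-weighted, hard ones kept
        p /= p.sum()
        models.append(model)
        model_weights.append(np.log(1.0 / beta_m))
    return models, np.asarray(model_weights)


def rbnnrw_predict(models, model_weights, X):
    """Aggregate sub-model predictions by the weighted median, as the abstract describes."""
    preds = np.column_stack([predict_nnrw(m, X) for m in models])
    return np.array([weighted_median(row, model_weights) for row in preds])
```

The weighted-median aggregation is what makes the ensemble prediction itself resistant to occasional poor sub-models, while the clipped-residual reweighting keeps calibration outliers from being oversampled in later boosting rounds.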
Pages: 20-26
Number of pages: 7