Modeling soft sensor based on sparse least square support vector machine

Authors
Liu, Ruilan [1 ]
Xu, Yan [2 ]
Rong, Zhou [1 ]
Affiliations
[1] School of Automation, Nanjing University of Post & Telecommunication, Nanjing, 210003, Jiangsu
[2] Light Industry School of Henan Province, Zhengzhou, 450006, Henan
Source
Huagong Xuebao/CIESC Journal | 2015, Vol. 66, No. 4
Funding
National Natural Science Foundation of China
Keywords
4-CBA concentration; Genetic algorithm; Global optimization; Least square support vector machine; Parameter identification; Soft sensor;
DOI
10.11949/j.issn.0438-1157.20141392
Abstract
The traditional least squares support vector machine (LSSVM) yields non-sparse solutions. A genetic-algorithm-based method for simultaneous sparsification and parameter optimization of LSSVM was proposed. The basic idea of sparsification is to assign a probability value to each training sample; if a sample's probability is less than 0.5, it is not treated as a support vector. Samples that are not support vectors are used as test samples, so the full training set is divided into a test set and a reduced training set. A fitness function combining the sparsity rate, the training error and the test error was defined. The first N dimensions of each individual in the population encode the probability of each sample, and the remaining m dimensions encode the parameters to be optimized. All of these variables, probabilities included, are optimized globally through selection, crossover and mutation operations. An LSSVM model was then built from the reduced training set and the optimized parameters of the individual with minimum fitness. The proposed method was applied to the soft sensing of 4-CBA concentration in the PX oxidation process. Simulation results on industrial data showed that the proposed method achieved a sparsity rate of up to 87%, identified the kernel parameters automatically, and produced a sparse model with better generalization capability than the non-sparse model. © 2015, Chemical Industry Press. All rights reserved.
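The sparsification scheme described in the abstract can be illustrated with a short, self-contained sketch. This is not the authors' code: it assumes an RBF kernel, treats the regularization parameter gamma and the kernel width sigma as the m optimized parameters, and uses arbitrary fitness weights, a minimal real-coded genetic algorithm, and synthetic data purely for demonstration.

# Illustrative sketch (not the paper's code): GA-driven sparsification of an
# RBF-kernel LSSVM. Assumed details: the m optimized parameters are gamma
# (regularization) and sigma (kernel width); fitness weights and the toy data
# are arbitrary choices for demonstration only.
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(A, B, sigma):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_fit(X, y, gamma, sigma):
    # Solve the standard LSSVM linear system for the bias b and the alphas.
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]                     # b, alpha

def lssvm_predict(Xsv, b, alpha, sigma, Xnew):
    return rbf_kernel(Xnew, Xsv, sigma) @ alpha + b

def fitness(ind, X, y):
    # First N genes: per-sample probabilities; last 2: log10(gamma) and sigma.
    probs, gamma, sigma = ind[:-2], 10.0 ** ind[-2], abs(ind[-1]) + 1e-3
    sv = probs >= 0.5                          # samples kept as support vectors
    if sv.sum() < 5 or sv.all():               # need both a train and a test set
        return 1e9
    b, alpha = lssvm_fit(X[sv], y[sv], gamma, sigma)
    e_tr = np.sqrt(np.mean((lssvm_predict(X[sv], b, alpha, sigma, X[sv]) - y[sv]) ** 2))
    e_te = np.sqrt(np.mean((lssvm_predict(X[sv], b, alpha, sigma, X[~sv]) - y[~sv]) ** 2))
    sparsity = 1.0 - sv.mean()                 # fraction of samples pruned
    return 0.5 * e_tr + 0.5 * e_te - 0.1 * sparsity   # lower fitness is better

# Toy data standing in for the industrial 4-CBA samples.
N = 80
X = rng.uniform(-3.0, 3.0, size=(N, 1))
y = np.sinc(X[:, 0]) + 0.05 * rng.normal(size=N)

pop_size, n_gen = 40, 50
pop = rng.uniform(0.0, 1.0, size=(pop_size, N + 2))
pop[:, -2] = rng.uniform(0.0, 3.0, pop_size)   # log10(gamma)
pop[:, -1] = rng.uniform(0.1, 2.0, pop_size)   # sigma

for gen in range(n_gen):
    fit = np.array([fitness(ind, X, y) for ind in pop])
    parents = pop[np.argsort(fit)[: pop_size // 2]]                    # selection
    mates = parents[rng.permutation(len(parents))]
    kids = np.where(rng.random(parents.shape) < 0.5, parents, mates)   # uniform crossover
    kids = kids + 0.1 * rng.normal(size=kids.shape) * (rng.random(kids.shape) < 0.1)  # mutation
    pop = np.vstack([parents, kids])

best = pop[np.argmin([fitness(ind, X, y) for ind in pop])]
sv = best[:-2] >= 0.5
print(f"sparsity rate: {1.0 - sv.mean():.0%}, "
      f"gamma = {10.0 ** best[-2]:.2f}, sigma = {abs(best[-1]) + 1e-3:.2f}")

The N + 2 gene encoding mirrors the paper's description (the first N dimensions are per-sample probabilities, the rest are the model parameters); the specific fitness weights and GA operators here are placeholders, not the published settings.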
Pages: 1402-1406
Number of pages: 4