Nonlinear process monitoring and fault isolation using extended maximum variance unfolding

Cited by: 37
Authors
Liu, Yuan-Jui [1 ]
Chen, Tao [2 ]
Yao, Yuan [1 ]
Affiliations
[1] Natl Tsing Hua Univ, Dept Chem Engn, Hsinchu 31003, Taiwan
[2] Univ Surrey, Dept Chem & Proc Engn, Guildford GU2 7XH, Surrey, England
Keywords
Process monitoring; Fault isolation; Nonlinear; Manifold learning; Maximum variance unfolding; Principal component analysis; Dimensionality reduction; Imputation; Diagnosis
DOI
10.1016/j.jprocont.2014.04.004
CLC number
TP [Automation technology; computer technology]
Discipline code
0812
Abstract
Kernel principal component analysis (KPCA) has become a popular technique for process monitoring, owing to its capability of handling nonlinearity. Nevertheless, KPCA suffers from two major disadvantages. First, the underlying manifold structure of the data is not considered in process modeling. Second, the selection of kernel parameters is problematic. To avoid these deficiencies, a manifold learning technique named maximum variance unfolding (MVU) is considered as an alternative. However, MVU can only handle the training data and provides no means of mapping new samples, so it cannot be applied to process monitoring directly. In this paper, an extended MVU (EMVU) method is proposed, which extends MVU to new samples by approximating the nonlinear mapping between the input space and the output space with a Gaussian process model. Consequently, EMVU is suitable for nonlinear process monitoring. A cross-validation algorithm is designed to determine the dimensionality of the EMVU output space. For online monitoring, three types of monitoring indices are developed: the squared prediction error (SPE), Hotelling's T2, and the prediction variance of the outputs. In addition, a fault isolation algorithm based on missing data analysis is designed for EMVU to identify the variables contributing most to a fault. The effectiveness of the proposed methods is verified by case studies on a numerical simulation and the benchmark Tennessee Eastman (TE) process. (C) 2014 Elsevier Ltd. All rights reserved.
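The out-of-sample extension and two of the monitoring indices described in the abstract can be sketched as follows. This is a minimal illustration under assumed choices (an RBF kernel of fixed width, independent GP outputs sharing one kernel, and a precomputed MVU embedding `Y` supplied as input), not the authors' implementation; the MVU optimization itself, the SPE index, and the control-limit estimation are omitted.

```python
import numpy as np

def rbf_kernel(A, B, width=2.0):
    # Squared-exponential covariance between the rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def gp_fit(X, Y, width=2.0, noise=1e-6):
    # One GP regression per embedding coordinate, all sharing the same kernel.
    K = rbf_kernel(X, X, width) + noise * np.eye(len(X))
    alpha = np.linalg.solve(K, Y)             # K^{-1} Y
    return X, alpha, K, width

def gp_predict(model, X_new):
    # Map new process samples into the low-dimensional embedding space.
    X, alpha, K, width = model
    Ks = rbf_kernel(X_new, X, width)          # cross-covariance
    mean = Ks @ alpha                         # predicted embedding of new samples
    v = np.linalg.solve(K, Ks.T)
    var = 1.0 - np.einsum('ij,ji->i', Ks, v)  # GP predictive variance index
    return mean, var

def hotelling_t2(y_new, Y_train):
    # Hotelling's T^2 of a new embedding point against the training scores.
    d = y_new - Y_train.mean(axis=0)
    S = np.cov(Y_train.T)
    return float(d @ np.linalg.solve(S, d))
```

In use, a fault would be flagged when an index such as T2 or the predictive variance exceeds a control limit estimated from the training statistics; how those limits are set follows the paper, not this sketch.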
Pages: 880-891 (12 pages)