
    Latent Variable Regression for Supervised Modeling and Monitoring

IEEE/CAA Journal of Automatica Sinica, 2020, Issue 3

    Qinqin Zhu

Abstract—A latent variable regression algorithm with a regularization term (rLVR) is proposed in this paper to extract latent relations between process data X and quality data Y. In rLVR, the prediction error between X and Y is minimized, which is proved to be equivalent to maximizing the projection of the quality variables in the latent space. The geometric properties and model relations of rLVR are analyzed, and the geometric and theoretical relations among rLVR, partial least squares, and canonical correlation analysis are also presented. The rLVR-based monitoring framework is developed to monitor process-relevant and quality-relevant variations simultaneously. The prediction and monitoring effectiveness of the rLVR algorithm is demonstrated through both numerical simulations and the Tennessee Eastman (TE) process.

    I. Introduction

IN industrial processes, timely process monitoring is of great importance: it helps detect potential hazards and enhance operation safety in the processes, thus contributing substantially to tomorrow's industry and imparting significant economic benefits. Traditionally, routine examinations by experienced personnel were the major approach to detecting anomalies, which, however, is prone to error and not completely reliable or comprehensive. With the advancement of technologies in data collection, transmission and storage, a new effective monitoring scheme based on multivariate analytical methods has emerged to track variations in the process in a timely and reliable fashion, and it is widely applied in chemical engineering, biology, pharmaceutical engineering, and management science [1]–[6]. Among these methods, principal component analysis (PCA), partial least squares (PLS) and canonical correlation analysis (CCA) are three popular and effective algorithms used in multivariate process monitoring.

PCA is a powerful tool to discover important patterns and reduce the dimension of process data, and it decomposes the original process space into the principal component subspace (PCS) with large variances and the residual subspace (RS) which mainly contains noise [7]. The monitoring scheme based on PCA is well defined to detect anomalies in the PCS with the T2 statistic and those in the RS with the Q statistic [1], [8]. In industrial processes, product quality is of major concern. The PCA-based monitoring scheme, however, fails to build the connection between X and the quality variables Y, and the information of Y is not available in either the modeling or the monitoring stage, which makes it hard to identify whether the faulty samples will affect product quality. Thus, supervised algorithms such as PLS and CCA are preferred.

PLS extracts the latent variables by maximizing the covariance between X and Y, thus quality information is successfully captured in the latent model. Since PLS pays attention to both process and quality variables, the captured latent variables contain variations that are orthogonal or irrelevant to Y, and further decomposition is necessary for comprehensive modeling of PLS [9]–[11]. Another issue involved in PLS is that its objectives of outer and inner modeling are different: the outer model maximizes the covariance between X and Y, while the inner model tries to minimize the regression errors between the process and quality scores. This discrepancy reduces the prediction efficiency of PLS.

Similar to PLS, CCA works as a supervised modeling algorithm to construct its latent space with the supervision of quality variables. The latent variables are extracted by maximizing the correlation between X and Y, and all the information contained in the latent variables is relevant to Y; thus CCA can get rid of the effect of process variances and retain a better prediction performance than PLS. The remaining information left in the process and quality variables may still be valuable to reveal abnormal variations in the data, which can contribute to the improvement of operation safety and economic efficiency; however, these variations remain unexploited in CCA, and concurrent CCA was proposed to conduct a full decomposition on X and Y [12].

T2 and Q statistics are employed to monitor variations in the PCS and RS subspaces for PLS and CCA as well, and they obtain satisfactory performance [8], [13]–[15]. Distributed process monitoring frameworks are also developed for plantwide processes [16]–[18].

In this paper, we propose a new regularized latent variable regression (rLVR) algorithm to build the relation between process and quality data, which is designed to have consistent outer and inner modeling objectives. Different from PLS and CCA, rLVR pays attention to both the correlation between X and Y and the variance of Y, which achieves better prediction power. The geometric properties and model relations of rLVR are analyzed, which reveals the orthogonality of the extracted latent variables. An rLVR-based monitoring framework is also developed to monitor process-relevant and quality-relevant variations.

The remainder of this paper is organized as follows. In Section II, the traditional latent variable methods, PLS and CCA, are reviewed. Section III presents the motivation and details of the regularized LVR algorithm. The geometric properties and model relations of rLVR and its equivalence with PLS and CCA are demonstrated in Section IV. The comprehensive monitoring scheme based on rLVR is developed in Section V. Section VI employs two case studies, a numerical simulation and the Tennessee Eastman process, to illustrate the effectiveness of rLVR over PLS and CCA from prediction and monitoring perspectives. The conclusions are drawn in the last section.

    II. Latent Structured Methods

    A. Projection to Latent Structures

Projection to latent structures (PLS), which is also referred to as partial least squares, is a typical supervised dimensionality reduction algorithm. It constructs a lower-dimensional space by maximizing the covariance between the projections of process data X ∈ R^{n×m} and quality data Y ∈ R^{n×p} in the latent space, where n is the number of training samples, and m and p are the numbers of process variables and quality variables respectively. The objective of PLS is mathematically presented as

J_PLS = max_{w,q} w^T X^T Y q,   s.t. ||w|| = 1, ||q|| = 1        (1)

where the score vectors t = Xw ∈ R^n and u = Yq ∈ R^n are the projections of X and Y on the latent space respectively, and the weighting vectors w ∈ R^m and q ∈ R^p are the projecting directions. Equation (1) is also called the outer modeling objective of PLS, and its solution can be derived iteratively with the aid of Lagrange multipliers.

In inner modeling, PLS builds a linear regression model between t and u,

u = t b + ε        (2)

where b is the regression coefficient, and ε is the modeling error. The first term in the above equation denotes the prediction of the quality score, û = t b, and the regression coefficient b can be obtained by minimizing the modeling error between u and û, which gives b = (t^T t)^{-1} t^T u.
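To make the outer and inner modeling steps concrete, the following sketch extracts one PLS latent variable with a NIPALS-style iteration. It is a minimal illustration under assumed conventions (initialization with the first column of Y, a simple convergence test); the function and variable names are not from the paper.

```python
import numpy as np

def pls_one_component(X, Y, max_iter=500, tol=1e-10):
    """Extract one PLS latent variable with NIPALS-style outer/inner steps."""
    u = Y[:, [0]]                        # initialize quality score with a column of Y
    for _ in range(max_iter):
        w = X.T @ u                      # outer model: weighting vector for X
        w /= np.linalg.norm(w)           # constraint ||w|| = 1
        t = X @ w                        # process score t = Xw
        q = Y.T @ t                      # weighting vector for Y
        q /= np.linalg.norm(q)           # constraint ||q|| = 1
        u_new = Y @ q                    # quality score u = Yq
        if np.linalg.norm(u_new - u) < tol:
            u = u_new
            break
        u = u_new
    # inner model: regression coefficient b minimizing ||u - t b||^2
    b = (t.T @ u).item() / (t.T @ t).item()
    return w, q, t, u, b
```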

    B. Canonical Correlation Analysis

Canonical correlation analysis (CCA) [21], also known as canonical variate analysis (CVA), works as a counterpart of PLS to extract latent structures by maximizing the correlation between X and Y, whose objective is formulated as

J_CCA = max_{w,q} (w^T X^T Y q) / ( sqrt(w^T X^T X w) · sqrt(q^T Y^T Y q) )

or equivalently,

J_CCA = max_{w,q} w^T X^T Y q,   s.t. w^T X^T X w = 1, q^T Y^T Y q = 1.        (4)

    In contrast to the iterative way in PLS, the latent variables of CCA can be derived directly by singular value decomposition (SVD). The detailed procedure of CCA is summarized as

1) Pre-process X and Y to make them zero-mean and unit-variance;

2) Perform SVD on the scaled X and Y (i.e., on their whitened cross-covariance);

3) Calculate the weighting matrices W = [w1, w2, ..., wl] and Q = [q1, q2, ..., ql] from the SVD results.

The deflation of CCA can be obtained similarly to PLS by minimizing ||X − T P^T|| and ||Y − T C^T||, leading to

P = X^T T (T^T T)^{-1},   C = Y^T T (T^T T)^{-1}

where T = XW, and P and C are the loading matrices for the process and quality variables in CCA.

It is known that CCA is sensitive to noise in the presence of strong collinearity. Thus, regularized CCA (rCCA) was proposed to address the ill-conditioned performance by designing two regularization terms on both the process and quality sides [12].
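As a concrete illustration of the SVD route described above, the sketch below computes the CCA weighting and loading matrices; the ridge terms kx and ky stand in for the two rCCA regularization terms. This is a minimal sketch under the stated scaling assumptions, not the paper's implementation.

```python
import numpy as np

def _inv_sqrt(S):
    """Inverse square root of a symmetric positive-definite matrix."""
    vals, vecs = np.linalg.eigh(S)
    return vecs @ np.diag(1.0 / np.sqrt(vals)) @ vecs.T

def cca_weights(X, Y, l, kx=0.0, ky=0.0):
    """CCA (or rCCA when kx, ky > 0) weights via SVD of the whitened cross-covariance.
    X and Y are assumed to be zero-mean and unit-variance scaled."""
    n = X.shape[0]
    Sxx = X.T @ X / (n - 1) + kx * np.eye(X.shape[1])   # ridge term on the process side
    Syy = Y.T @ Y / (n - 1) + ky * np.eye(Y.shape[1])   # ridge term on the quality side
    Sxy = X.T @ Y / (n - 1)
    Kx, Ky = _inv_sqrt(Sxx), _inv_sqrt(Syy)
    U, s, Vt = np.linalg.svd(Kx @ Sxy @ Ky)
    W = Kx @ U[:, :l]                    # weighting matrix W = [w1, ..., wl]
    Q = Ky @ Vt.T[:, :l]                 # weighting matrix Q = [q1, ..., ql]
    T = X @ W                            # latent scores T = XW
    P = X.T @ T @ np.linalg.inv(T.T @ T) # loadings from minimizing ||X - T P^T||
    C = Y.T @ T @ np.linalg.inv(T.T @ T) # loadings from minimizing ||Y - T C^T||
    return W, Q, P, C, T
```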

    C. Several Notes for PLS and CCA

In (1), PLS pays attention to both the correlation between X and Y and their variances, and the extracted latent variables contain irrelevant or orthogonal information, which makes no contribution to predicting Y. As a consequence, PLS needs a superfluous number of latent variables. For instance, multiple latent variables may be required to predict only one-dimensional quality data Y. Thus, further decomposition is designed in subsequent works, such as total PLS [9] and concurrent PLS [10]. Another issue involved with PLS is the inconsistent objectives for outer and inner modeling, as observed in (1) and (2).

CCA achieves better prediction or modeling performance by focusing on the correlation between process and quality variables only. However, CCA attaches equal importance to process and quality variables, and the modeling performance can be further improved by incorporating the variances and signals of the quality variables Y.

    Therefore, motivated by the aforementioned analysis, we propose the latent variable regression (LVR) method in the next section [22].

    III. Latent Variable Regression

    A. Outer Model

In LVR, in order to make the inner and outer modeling consistent, the following outer relation is designed:

J_outer = min_{w,q} ||u − t||^2 = min_{w,q} ||Y q − X w||^2,   s.t. ||q|| = 1        (5)

where the symbols have the same meaning as in (1). It is noted that the constraint in (5) is different from those for PLS, which is designed on purpose and will be explained in the following subsections.

    The solution of (5) can be obtained with Lagrange multiplier λqas follows:

Taking derivatives with respect to w and q and setting the results to zero yields

    By re-arranging the above equations, we have

where w^T and q^T are pre-multiplied in (6) and (7), which leads to

    Lemma 1:The least squares objective for latent variable regression in (5) is equivalent to minimizing the Lagrange multiplier λq.

Proof: In (5), J_outer can be expanded as

Thus, minimizing the prediction error between the projection scores t and u in LVR is equivalent to finding the minimum λq.

    B. Inner Model

Equations (8) and (9) constitute the outer modeling of LVR. For inner modeling, the same objective is applied; that is, to minimize the least squares error

J_inner = min_b ||u − t b||^2        (10)

which leads to

b = (t^T t)^{-1} t^T u = 1

since the outer model in (5) has already minimized ||u − t||^2 over w.

Remark 1: The inner modeling is not needed in the latent variable regression method.

    It is noted that Remark 1 is an expected result due to the same outer and inner objectives in (5) and (10).

C. Deflation of X and Y

Deflation of X and Y is performed to remove the effects of the extracted latent variables, which can be represented as

X := X − t p^T,   Y := Y − t c^T

where p ∈ R^m and c ∈ R^p are the loading vectors for X and Y respectively, and they can be calculated by minimizing the regression errors ||X − t p^T||^2 and ||Y − t c^T||^2, leading to

p = X^T t / (t^T t),   c = Y^T t / (t^T t).

    Therefore, the procedure to extract latent variables in LVR is summarized as follows.

1) Scale the process and quality data X and Y to zero mean and unit variance.

2) Initialize u as the first column of Y, and repeat the following relations until convergence is achieved.

3) Deflate X and Y with the deflation relations above.

4) Conduct Steps 2 and 3 for the next round until l latent variables are extracted, where l is determined by cross-validation.
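A minimal sketch of this extraction procedure is given below. It assumes that, for fixed q, the weighting vector w is the least-squares solution of (5) (the regression of u on X) and that q is normalized to unit length in each pass; the initialization, deflation and convergence test follow the steps above, while all names are illustrative.

```python
import numpy as np

def lvr_extract(X, Y, l, max_iter=500, tol=1e-10):
    """Sketch of LVR latent variable extraction (Steps 1-4 above).
    X and Y are assumed to be already scaled to zero mean and unit variance."""
    X_d, Y_d = X.copy(), Y.copy()
    Ws, Ps, Cs, Ts = [], [], [], []
    for _ in range(l):
        u = Y_d[:, [0]]                               # Step 2: initialize u
        for _ in range(max_iter):
            # least-squares weight minimizing ||X_d w - u||^2 (assumed update)
            w = np.linalg.lstsq(X_d, u, rcond=None)[0]
            t = X_d @ w                               # process score
            q = Y_d.T @ t
            q /= np.linalg.norm(q)                    # constraint ||q|| = 1
            u_new = Y_d @ q
            if np.linalg.norm(u_new - u) < tol:
                u = u_new
                break
            u = u_new
        p = X_d.T @ t / (t.T @ t).item()              # loading vectors
        c = Y_d.T @ t / (t.T @ t).item()
        X_d = X_d - t @ p.T                           # Step 3: deflation
        Y_d = Y_d - t @ c.T
        Ws.append(w); Ps.append(p); Cs.append(c); Ts.append(t)
    return np.hstack(Ws), np.hstack(Ps), np.hstack(Cs), np.hstack(Ts)
```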

    D. Regularized Latent Variable Regression

In LVR, the inversion of the covariance of X is involved to calculate the weighting vector w, and collinearity in X will lead to inconsistent results, which is shown in Fig. 1. In Fig. 1(a), two columns of X are tightly correlated, and since the angle between x1 and x2 is not zero, Plane 1 is formed. In this case, the quality variable y is able to project onto Plane 1, and the projection is y′. However, in most cases, the data are subject to noise, which will make x1 deviate from its original direction, and x̃1 is the resulting direction. In Fig. 1(b), the new plane, Plane 2, defined by x̃1 and x2, drastically diverges from Plane 1, and the new projection ỹ in Plane 2 is also different from y′. As concluded from Fig. 1, when the data are strongly collinear, the results of LVR are not reliable and consistent. Thus, it is necessary to address the collinearity issue, which can be achieved by constraining the norm of w with a regularization term.

Lemma 2: The LVR objective in (5) is equivalent to the following objective [23]:

J = max_{w,q} w^T X^T Y q,   s.t. w^T X^T X w = 1, ||q|| = 1.        (13)

Fig. 1. Ill-conditioned performance caused by collinearity. (a) Projection of y; (b) Large deviation of the projection of y caused by noise.

    The proof details are given in Appendix A. The new objective in (13) for LVR is similar to the formulation for PLS in (1) or for CCA in (4); however, due to the different constraints, the derived solutions and their geometric properties are different. Since the objective in (13) is more conventional in process monitoring, it is adopted to develop the regularized LVR (rLVR) algorithm.

    A regularization term is designed in rLVR as follows to address the strong collinearity [22].

where γ is the regularization parameter.

The detailed rLVR algorithm is summarized in Algorithm 1. There are two parameters involved in Algorithm 1, which are l and κ, and they can be determined jointly with cross-validation.

    Algorithm 1 Regularized Latent Variable Regression
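Algorithm 1 itself is not reproduced here. As a rough illustration only, the fragment below sketches how the weight update of the plain LVR iteration above might incorporate the regularization, assuming a ridge-type term κI added to the covariance of X; the exact form of the regularized objective in (14) is not shown above, so this form is an assumption.

```python
import numpy as np

def rlvr_weight_update(X_d, u, kappa):
    """Assumed ridge-regularized weight update for rLVR: the term kappa*I keeps
    the inversion of the covariance of X well-conditioned under collinearity."""
    m = X_d.shape[1]
    w = np.linalg.solve(X_d.T @ X_d + kappa * np.eye(m), X_d.T @ u)
    t = X_d @ w
    # one common normalization: rescale w so that the score t = X_d w has unit norm
    # (the paper's exact scaling convention may differ)
    w = w / np.linalg.norm(t)
    return w, X_d @ w
```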

    IV. Geometric Properties and Model Relations

    A. rLVR Geometric Properties

PCA, PLS and CCA extract their latent variables by maximizing a statistical metric (variance, covariance or correlation), and their geometric properties are well studied [2], [13], [24]. The idea of the rLVR algorithm is different from PCA, PLS and CCA, and it is important to understand the structure of its latent space for further applications.

For ease of illustration, a subscript i is used to denote the iteration order. For instance, t_i denotes the ith latent score, and X_i and Y_i are the deflated process and quality datasets in the ith extraction round, and we have

X_{i+1} = X_i − t_i p_i^T,   Y_{i+1} = Y_i − t_i c_i^T,   with X_1 = X and Y_1 = Y.

Then, with this notation and the relations in Algorithm 1, we have the following lemma.

Lemma 2: We have the following orthogonality properties between the residuals and the model parameters in the regularized LVR algorithm:

    The proof of Lemma 2 is given in Appendix B. With the orthogonal relations in Lemma 2, it is straightforward to derive the orthogonality among model scores, weights and loadings, which is summarized in the following theorem.

    Theorem 1:The following orthogonal geometric properties hold for regularized LVR.

Proof: To prove Relations 1 and 2, the p expression in (15) is utilized:

For Relation 3, assuming that i > j, then

The case of i < j follows similarly.

    Additionally, from the Lagrange relations of (14), we have

where λw and λq are the Lagrange multipliers for w_i and q_i respectively. Thus, we have

where c is a normalization coefficient. With the relations in Lemma 2 and assuming i > j,

Similarly, the result holds when i < j. Therefore, Relation 4 is proved.

Theorem 1 shows that the scores t_i are mutually orthogonal, and the deflation of the process and quality datasets can be represented as

Equation (20) implies that in order to calculate w and q in (6) and (7), only one dataset needs to be deflated for further iterations.

    B. rLVR Model Relations

After performing rLVR, the process and quality data can be predicted by

X̂ = T P^T,   Ŷ = T Q^T

where P = [p1, p2, ..., pl] and Q = [q1, q2, ..., ql]. Both P and Q are available from the training stage, while T varies with the process data X. Thus, it is necessary to derive the explicit relation between X and T.

According to (39) in Appendix B, X_i can be re-arranged into

where N_{1:i−1} collects the deflation operations of the first i − 1 rounds. Then each score vector t_i in T is calculated by

t_i = X_i w_i = X N_{1:i−1} w_i = X r_i

where r_i ≡ N_{1:i−1} w_i. With Relations 1 and 2 in Theorem 1, it is easy to show that

T = X R

where R = [r1, r2, ..., rl], and W = [w1, w2, ..., wl]. Therefore, T and the predictions of X and Y can be calculated from the process data directly.
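The relation T = XR can be evaluated as sketched below. The sketch assumes that the accumulated deflation matrix takes the product form N_{1:i−1} = (I − w_1 p_1^T) ··· (I − w_{i−1} p_{i−1}^T) implied by the deflation X_{i+1} = X_i − t_i p_i^T; all names are illustrative.

```python
import numpy as np

def scores_and_predictions(X, W, P, Q):
    """Build R = [r1, ..., rl] with r_i = N_{1:i-1} w_i, so that T = X R,
    then form the predictions of X and Y directly from the process data."""
    m, l = W.shape
    R = np.zeros_like(W)
    N = np.eye(m)                          # running product N_{1:i-1}, starts at I
    for i in range(l):
        R[:, [i]] = N @ W[:, [i]]
        N = N @ (np.eye(m) - W[:, [i]] @ P[:, [i]].T)
    T = X @ R
    X_hat = T @ P.T                        # prediction of the process data
    Y_hat = T @ Q.T                        # prediction of the quality data
    return R, T, X_hat, Y_hat
```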

The following properties hold for R and P.

Lemma 3: P R^T and I − P R^T are idempotent matrices. That is,

(P R^T)(P R^T) = P R^T,   (I − P R^T)(I − P R^T) = I − P R^T.

The proof of Lemma 3 is provided in Appendix C. Lemma 3 demonstrates that both P R^T and I − P R^T are orthogonal projection matrices.

In online prediction, the prediction of the quality data is calculated from the new sample x directly,

ŷ = Q R^T x

and the new sample x can be modeled as

x = x̂ + x̃ = P R^T x + (I − P R^T) x.
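For a single pre-scaled sample, the online prediction and the split into modeled and residual parts can be sketched as follows, using the R, P and Q matrices obtained from training; this is a minimal illustration.

```python
import numpy as np

def online_predict(x_new, R, P, Q):
    """Online prediction for one (scaled) sample x_new given as a column vector."""
    t = R.T @ x_new               # latent score of the new sample
    y_hat = Q @ t                 # predicted quality variables, y_hat = Q R^T x
    x_hat = P @ t                 # part of x explained by the latent model
    x_res = x_new - x_hat         # residual part, used for residual-space monitoring
    return y_hat, x_hat, x_res
```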

    Theorem 2:Regularized latent variable regression algorithm

Proof: From (27), we have

    with Lemma 1, the first item in (29) is

    Similarly, the second item is

    Therefore

    C. Relation Among PLS, CCA and LVR

    A generalized formulation of PLS, CCA [21] and LVR can be derived as [25]

where 0 ≤ αw, αq ≤ 1.

    PLS, CCA and LVR are three special cases of (30):

    1) When αw=1 and αq=1, (30) reduces to PLS;

    2) When αw=0 and αq=0, (30) reduces to CCA, and the constraints of ||w||=1 and ||q|| = 1 are equivalent to adding a regularization term forXandYrespectively;

3) When αw=0 and αq=1, (30) stands for LVR, and the regularization term is incorporated with the extra constraint ||w|| = 1.

Geometrically, for ease of comparison, the objectives of PLS, CCA and LVR are re-arranged as

J_PLS = max ||t|| · ||u|| · cos θ,   J_CCA = max cos θ,   J_LVR = max ||u|| · cos θ

where θ is the angle between u and t, and for simplicity, the regularization term is omitted in the discussion for LVR. The geometric relations among PLS, CCA and LVR are presented in Fig. 2.

    Fig. 2. The geometric relations among PLS, CCA and LVR.

As discussed in Section II, in addition to the relation between the scores u and t, PLS also emphasizes the variances of X and Y as shown in Fig. 2, and the extracted latent variables obtain less effective prediction power.

In contrast, CCA maximizes the correlation between the projections of X and Y in the latent space, thus it only focuses on the angle between u and t. CCA works well for prediction; however, since the variances of the process and quality spaces are not exploited, further decompositions are necessary for good monitoring performance [12].

The proposed LVR algorithm maximizes the projections of the quality scores on the latent space, and both the variance of Y and the angle between u and t are considered, leading to better prediction effectiveness.

    V. Process Monitoring With rLVR

It is important to develop a monitoring system based on the extracted latent variables in rLVR to detect anomalies in both the principal component subspace (PCS) and the residual subspace (RS).

The variations in the PCS are relevant to quality variables and contain large variances. Assuming that they are normally distributed, the T2 index can be utilized to monitor quality-relevant variations in this subspace [1]. For a new sample x, its T2 is calculated by

T2 = t^T Λ^{-1} t

where t = R^T x, and Λ = T^T T / (n − 1) is a diagonal matrix. The threshold for T2 can be defined as

T2_lim = l (n^2 − 1) / ( n (n − l) ) · F_{l, n−l, α}

where F_{l, n−l, α} denotes an F-distribution with l and n − l degrees of freedom, and α defines the confidence level.
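A sketch of the T2 computation and its control limit is given below. The F-distribution-based limit uses a commonly adopted form, which is an assumption here, and `alpha` denotes the significance level (so that faults are flagged with (1 − α) × 100% confidence).

```python
import numpy as np
from scipy import stats

def t2_index_and_limit(x_new, R, T_train, alpha=0.01):
    """Quality-relevant T^2 index for a new (scaled) sample and its control limit."""
    n, l = T_train.shape
    Lambda = (T_train.T @ T_train) / (n - 1)          # diagonal score covariance
    t = R.T @ x_new                                   # latent score of the new sample
    T2 = (t.T @ np.linalg.inv(Lambda) @ t).item()
    # control limit based on the F-distribution with l and n-l degrees of freedom
    limit = l * (n**2 - 1) / (n * (n - l)) * stats.f.ppf(1 - alpha, l, n - l)
    return T2, limit
```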

The information contained in the residual space is not related to the quality variables, but it is still beneficial to monitor the variations in the RS for the sake of operation efficiency and safety. It is not appropriate to use the Q statistic directly as in PCA [1], [2], since the variances in the RS may still be very large. Thus, a subsequent PCA decomposition is applied to the residual X̃ to extract the latent structure in the RS,

X̃ = T_r P_r^T + X̃_r

where T_r and X̃_r can be monitored with the T_r^2 and Q_r indices respectively.
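The subsequent PCA on the process residual can be organized as in the sketch below; the number of residual principal components `n_pc` and the exact index definitions are illustrative assumptions rather than the paper's specification.

```python
import numpy as np

def residual_space_monitors(X_train, R, P, n_pc):
    """PCA on the process residual X_tilde = X - X R P^T, splitting it into a
    principal part (monitored with T_r^2) and a final residual (monitored with Q_r)."""
    n = X_train.shape[0]
    X_tilde = X_train - X_train @ R @ P.T
    U, s, Vt = np.linalg.svd(X_tilde, full_matrices=False)
    P_r = Vt[:n_pc].T                                  # residual-space loadings
    lam = (s[:n_pc] ** 2) / (n - 1)                    # residual-space score variances

    def indices(x_new):
        """T_r^2 and Q_r for a new (scaled) sample given as a column vector."""
        x_res = x_new - P @ (R.T @ x_new)              # residual part of the sample
        t_r = P_r.T @ x_res
        T2_r = float(np.sum(t_r.ravel() ** 2 / lam))
        Q_r = float(np.sum((x_res - P_r @ t_r) ** 2))
        return T2_r, Q_r

    return indices
```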

    The detailed monitoring statistics and their corresponding thresholds are summarized in Table I, and the monitoring scheme is as follows.

1) If T2 exceeds its control limit, a quality-relevant fault is detected with (1 − α) × 100% confidence.

    TABLE I Monitoring Statistics and Control Limits

2) If T_r^2 is larger than its control limit, a process-relevant fault is detected with (1 − α) × 100% confidence, and the fault will not affect quality variables.

3) If Q_r exceeds its threshold, the relation between process and quality variables might be broken, which needs further investigation.

Remark 2: When the quality measurements are available, a similar decomposition can be further applied to the residual of the quality variables, and the corresponding monitoring scheme can be developed, which is referred to as concurrent monitoring, as developed for PLS [10] and CCA [12].

    VI. Case Studies

    A. Case Studies on Simulation Data

To verify the effectiveness and robustness of LVR, we generate two scenarios in this section, and collinearity is introduced in both scenarios. rLVR, regularized CCA (rCCA) [12] and PLS are performed on the first scenario, and their performance is compared in terms of the correlation coefficient and the proportion of variance explained of the process and quality variables. In Scenario II, different noise levels are designed to show the robustness of rLVR.

1) Scenario 1: The following expressions are used to generate the data for the first scenario:

    where

where e ∈ R^5 ~ N(0, 0.2^2), v ∈ R^4 ~ N(0, 0.8^2), and t ∈ R^5 with t_i ~ N(0, (3 × i)^2) for i ∈ {1, 2, ..., 5}. It is noted that strong collinearity is introduced in both X and Y, where the 2nd and 4th columns and the 3rd and 5th columns of A, and the 2nd and 4th rows of C, are highly dependent.

800 samples are generated with (37), and the first 600 are used as training data while the remaining 200 are used as test data. The model parameters are selected through cross-validation: for rLVR, l = 3 and κ = 0.001; for rCCA, l = 3, κx = 0.001, and κy = 0.059; and for PLS, l = 5.

The correlation coefficient r and the proportion of variance explained of the process variables (PVEx) and quality variables (PVEy) for these models are shown in Figs. 3–5. As presented in Fig. 3, since rLVR pays attention to both the correlation between X and Y and the variance of Y, its correlation coefficient and PVEy are the highest for the first latent component, leaving less information in the residuals. rCCA focuses on maximizing the correlation between the process and quality data. Thus, its r for each latent variable is relatively high; however, its ability to exploit the process and quality variances is weak, which requires further processing. PLS in Fig. 5 tries to incorporate all three factors (correlation, process variance and quality variance), but the regression relation between X and Y in the PLS model is the weakest among the three models, and it requires five principal components to achieve good performance.

    Fig. 3. Correlation coefficient and proportion of variance explained for rLVR in Scenario 1.

    Fig. 4. Correlation coefficient and proportion of variance explained for rCCA in Scenario 1.

    Fig. 5. Correlation coefficient and proportion of variance explained for PLS in Scenario 1.

2) Scenario 2: The same formulation in (37) is adopted in Scenario II, and the data are generated with different levels of the noise v as follows:

Through cross-validation, the model parameters are selected as l = 3 and κ = 0.001 for rLVR and l = 3 for LVR in both cases. To compare the robustness of the models, the following metric is defined [12], which denotes the angle between the weighting vectors r_i obtained under different magnitudes of noise:

where i ∈ {1, 2, 3}. The results of LVR and rLVR are summarized in Table II. As observed from the table, when the noise level increases, the angles between the weighting vectors for rLVR remain small, and its constructed latent structure is consistent. However, LVR is sensitive to noise, and the resulting angles diverge, which is illustrated in Fig. 1.
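The robustness metric can be evaluated with a sketch like the following, which computes the angle between the weighting vectors obtained under two noise levels; the use of the absolute cosine is an assumption, and the exact definition in [12] may differ.

```python
import numpy as np

def weight_angle_deg(r_a, r_b):
    """Angle (in degrees) between two weighting vectors obtained under different
    noise levels; small angles indicate a consistent latent structure."""
    cos = abs(float(np.dot(r_a.ravel(), r_b.ravel())))
    cos /= (np.linalg.norm(r_a) * np.linalg.norm(r_b))
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))
```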

The regularization term in rLVR handles strongly collinear cases. However, the value of the regularization parameter κ should not be too large; otherwise, it will have a negative effect on the prediction performance. As shown in Fig. 6, with increasing values of κ, the mean squared errors (MSEs) of the quality variables increase as well, where Yi (i ∈ {1, 2, 3, 4}) denotes the ith quality variable. Additionally, an appropriate number of latent variables l is also important for the effectiveness of rLVR: if l is too small, the extracted latent variables cannot exploit the process and quality spaces fully, leading to sub-optimal prediction and monitoring performance; on the other hand, if l is too large, the extra latent factors tend to introduce noise into the model, which may have negative effects on system modeling. Therefore, it is essential to employ cross-validation to determine the values of κ and l.
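A joint cross-validation of κ and l can be organized as in the sketch below, where `fit` and `predict` are placeholders for the rLVR training and quality-prediction routines (for instance, the ones sketched earlier); the fold construction and MSE scoring are illustrative assumptions.

```python
import numpy as np

def select_l_kappa(X, Y, fit, predict, l_grid, kappa_grid, n_folds=5, seed=0):
    """Pick (l, kappa) minimizing the cross-validated MSE of the quality prediction.
    `fit(X_tr, Y_tr, l, kappa)` returns a model; `predict(model, X_te)` returns Y_hat."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    folds = np.array_split(rng.permutation(n), n_folds)
    best, best_mse = None, np.inf
    for l in l_grid:
        for kappa in kappa_grid:
            errs = []
            for fold in folds:
                train = np.setdiff1d(np.arange(n), fold)
                model = fit(X[train], Y[train], l, kappa)
                Y_hat = predict(model, X[fold])
                errs.append(np.mean((Y[fold] - Y_hat) ** 2))
            mse = float(np.mean(errs))
            if mse < best_mse:
                best, best_mse = (l, kappa), mse
    return best, best_mse
```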

    TABLE II Angles Between r for LVR and rLVR (°)

    Fig. 6. MSEs of quality variables with increasing κ.

    B. Case Study on the Tennessee Eastman Process

In this section, the Tennessee Eastman (TE) process [26] is utilized to further demonstrate the effectiveness of the proposed algorithm. The TE process was created by the Eastman Chemical Company for the purpose of developing and evaluating techniques proposed in the process systems engineering field. The process involves five main components: the reactor, condenser, stripper, compressor and separator. The reactions occurring in the reactor are

A(g) + C(g) + D(g) → G(liq),   A(g) + C(g) + E(g) → H(liq),
A(g) + E(g) → F(liq),   3D(g) → 2F(liq)

where the reactants A, C, D and E are gases, and the main products G and H and the byproduct F are liquids.

Two blocks of data are available in the TE process, which are the process measurements XMEAS(1–41) and the manipulated variables XMV(1–12). The detailed description of these variables is summarized in [26]. In this case study, XMEAS(1–22) and XMV(1–11) are selected as process variables, and XMEAS(35–36) are chosen as quality variables. Cross-validation is applied to choose the model parameters: for rLVR, l = 1 and κ = 0.013; for CCA, l = 1, κx = 0.1, and κy = 0.001; and for PLS, l = 4. In this case study, the regularized LVR and CCA are employed for performance comparison to address the collinearity in the TEP data.

    Downs and Vogel [26] simulated 20 disturbances for further analysis, and two typical ones are selected to compare the performance of rLVR, PLS and CCA in our work, which are IDV(1) and IDV(4).

1) IDV(1) – a Step Disturbance in the A/C Feed Ratio: While keeping the composition of B constant, a step change is introduced in the A/C feed ratio in IDV(1). Figs. 7–9 show the prediction performance of rLVR, PLS and CCA respectively. It is noted that the quality variables XMEAS(35) and XMEAS(36) are denoted as Y1 and Y2 respectively in the figures. In PLS, a gap exists consistently for XMEAS(36) even after the process returns to normal under the effect of the controllers. In contrast, both rLVR and CCA work well to predict the variations and trends of the quality variables, with rLVR being slightly better in terms of MSEs as shown in Table III. It is noted that rLVR cannot fully follow the trend of the quality variables when the disturbance is introduced; this is caused by the dynamics in the process, which will be addressed with a dynamic extension of rLVR in future work.

    Fig. 7. Prediction results of rLVR for IDV(1).

    Fig. 8. Prediction results of PLS for IDV(1).

    Fig. 9. Prediction results of CCA for IDV(1).

    TABLE III MSEs of rLVR, CCA and PLS for IDV(1)

The monitoring performance of rLVR and PLS is shown in Figs. 10 and 11, and CCA's results are omitted due to their negligible differences from rLVR on these data. In Figs. 10 and 11, T2, T_r^2 and Q_r denote the monitoring indices for the principal component subspace, the process principal subspace, and the process residual subspace, respectively, while the quality monitoring index is obtained by performing PCA on the quality variables directly. As observed from the figures, for both rLVR and PLS, T2, T_r^2 and Q_r respond more quickly than the quality monitoring index, where the black vertical line denotes the timestamp when the disturbance is introduced. Aligning with the prediction results, PLS tends to return to normal after the tuning of the controllers, but false alarms are still consistently raised after Sample 200. In contrast, rLVR follows the quality trends better than PLS, and only process-relevant faults are detected in T_r^2 and Q_r with a lower level of importance. Therefore, due to the emphasis on quality information in the modeling phase, rLVR-based prediction and monitoring perform better than PLS.

    Fig. 10. Monitoring results of rLVR for IDV(1).

    Fig. 11. Monitoring results of PLS for IDV(1).

2) IDV(4) – a Step Change in the Reactor Cooling Water Inlet Temperature: In IDV(4), due to the correction of the controllers, the quality variables are not affected, and their variations and the predictions of rLVR, PLS and CCA are presented in Figs. 12–14. In terms of prediction performance, rLVR, PLS and CCA achieve comparable results as summarized in Table IV, with PLS performing the worst. As validated by the monitoring index variations in Figs. 15 and 16, the disturbance in IDV(4) is quality-irrelevant. However, PLS raises many false alarms for quality-relevant faults with the T2 statistic, which reduces the reliability of the fault detection system. The monitoring results for rLVR in Fig. 15 are more credible, and they indicate that the disturbance affects the process variables only.

    Fig. 12. Prediction results of rLVR for IDV(4).

    Fig. 13. Prediction results of PLS for IDV(4).

    Fig. 14. Prediction results of CCA for IDV(4).

    TABLE IV MSEs of rLVR, CCA and PLS for IDV(4)

    VII. Conclusions

In this paper, a new regularized latent variable regression (rLVR) method is proposed for multivariate modeling and process monitoring. rLVR aims to maximize the projection of the quality variables on the latent space, which is shown to be equivalent to minimizing the prediction error between the process and quality scores. The geometric properties and model relations are derived and summarized for rLVR, and the relation among rLVR, PLS and CCA is analyzed both theoretically and geometrically. The process monitoring framework based on rLVR is developed to detect anomalies in the principal component subspace, the process principal subspace and the process residual subspace. Two case studies, numerical simulations and the Tennessee Eastman process, are employed to demonstrate the effectiveness of rLVR over PLS and CCA in terms of prediction and monitoring.

    Fig. 15. Monitoring results of rLVR for IDV(4).

    Fig. 16. Monitoring results of PLS for IDV(4).

    Appendix A Proof of Lemma 1

    With relation in (8), the objective of LVR in (5) can be rearranged as

where θ is the angle between u and t, and its range is [0°, 180°]. Additionally, since the direction of t or u makes no difference to the minimum value of J, the range of θ can be further restricted to [0°, 90°]. Therefore, the objective is equivalent to

By substituting t = Xw and u = Yq, the equivalence between (5) and (13) is proved.

    Appendix B Proof of Lemma 2

    1) From Algorithm 1 and (16), we have

where

    Then the first item in Lemma 2 can be proved by

2) Another way to represent X_i is

    where

Additionally, proving the second item is equivalent to showing

It is noted that the last two items in Lemma 2 can be proved in a similar way to the first two items; thus, their proofs are omitted in the paper.

The loading matrix P can be expressed as

    Appendix C Proof of Lemma 3

Then R^T P is proved to be an identity matrix as follows:

    Thus,
