Article

Generalized Correntropy Criterion-Based Performance Assessment for Non-Gaussian Stochastic Systems

School of Control and Computer Engineering, North China Electric Power University, Beijing 102206, China
* Author to whom correspondence should be addressed.
Submission received: 19 May 2021 / Revised: 11 June 2021 / Accepted: 15 June 2021 / Published: 17 June 2021

Abstract

Control loop performance assessment (CPA) is essential in the operation of industrial systems. In this paper, the shortcomings of existing performance assessment methods and indicators are first summarized, and a novel evaluation method based on the generalized correntropy criterion (GCC) is proposed to evaluate the performance of non-Gaussian stochastic systems. This criterion characterizes the statistical properties of non-Gaussian random variables more fully, so it can be used directly as the assessment index. When the expected output of the given system is unknown, generalized correntropy is used to describe the similarity of two random variables in the joint space neighborhood controlled by the kernel width, and it serves as the criterion function of the identification algorithm. To estimate the performance benchmark more quickly and accurately, a hybrid EDA (H-EDA) combined with the idea of the "wading across the stream algorithm" is proposed to obtain the system parameters and the disturbance noise PDF. The effectiveness of the improved algorithm and the new indexes is verified through the simulation of a single-loop feedback control system under different noise disturbances.

1. Introduction

With the continuous improvement of automation in industrial processes, production is becoming more and more complex. Control loops play the most important role in automation systems. Product quality, operation safety, material consumption and energy consumption are closely related to the performance of the control system. Excellent control loop performance ensures the effectiveness of the control systems, thus guaranteeing product quality and reducing product cost. The significance of performance assessment lies in realizing, restoring and maintaining the optimal performance of the control loops at all times. Therefore, it is of great practical value to evaluate system performance quickly and accurately in industrial operation. At present, control loop performance assessment (CPA) has become an essential technology to ensure the smooth progress of industrial production [1].
Noise is unavoidable during performance evaluation [2]. Most current performance evaluation methods assume that the noise obeys a Gaussian distribution, so the research focuses on the first- and second-order moments of the target output (i.e., the mean and variance). Harris [3] proposed a CPA index based on minimum variance control (MVC). This method is regarded as a milestone in performance assessment research, so the index is named after its author, Harris. Later researchers have made great progress on this foundation: the MVC-based performance evaluation index has been applied to various types of control systems, and the index has been improved for different situations [4,5,6]. At present, MVC-based methods of system performance assessment are very mature and perform remarkably well when the noise disturbance is Gaussian. However, most disturbances in actual industrial processes obey non-Gaussian distributions, and the traditional MVC methods are no longer applicable in this case. A more suitable indicator should be chosen, one that reflects the higher-order statistical characteristics of the variables in the closed-loop control system rather than only the mean and variance.
For a stochastic system with non-Gaussian noises, an early research direction approximated the probability density function (PDF) of the output data by B-splines [7]. A stochastic distribution control algorithm based on minimum entropy control (MEC) was proposed by Zhang and Chu [8]. For an arbitrary stochastic variable, information entropy has a more general meaning than the mean or variance: minimizing entropy rather than the mean square error minimizes all higher-order moments, not just the second-order one. Therefore, the MEC control strategy performs much better than MVC in non-Gaussian systems. On this basis, significant progress has been made in the field of CPA with non-Gaussian noise [9,10,11,12,13,14]. When the expected output PDF is not given in advance, CPA methods for MEC-based controlled systems must identify the associated system model effectively and systematically from closed-loop data. The minimum entropy criterion was introduced into the feedback control system by Jiang [9], and a new performance assessment method came into being. In work [9], the model of the associated system is identified by the estimation of distribution algorithm (EDA) to obtain the optimal parameters; the mathematical model and disturbance sequence relating output and disturbance are then obtained. It was verified that the new benchmark is appropriate for both Gaussian and non-Gaussian systems. However, the method given by Jiang [9] is not detailed enough: the specific calculation methods of minimum entropy for discrete and continuous disturbances are not given. Fortunately, an improved minimum entropy calculation method was proposed by Zhou [10,11] to correct the defects of Jiang's method. In work [10], Zhou and Zhang analyzed the defects of Shannon entropy in the CPA process and proposed a new entropy index, rational entropy. The method given in [10] can calculate the theoretical benchmark value.
In work [11], the traditional EDA was improved to better identify the system parameters and estimate the noise PDF, and an estimation method for the benchmark value was also given. In reference [12], the rational entropy index is used to evaluate a typical cascade control system. However, entropy has the property of translation invariance; that is, the size of entropy is determined only by the shape of its distribution. When the mean of the error distribution is not around zero, a wrong assessment result may follow. For this case, Zhou proposed a mean-constrained performance index, while a CPA method based on mixture correntropy was proposed by Zhang [13]. The translation-invariance problem of minimum entropy can be solved by adjusting the kernel width and weight coefficient of the mixture correntropy. Similarly, to avoid the defect that the Renyi entropy indicator is insensitive to mean shift, Zhang [14] combined the Renyi entropy benchmark with the mean value to construct a new indicator that can reflect the mean shift of a non-Gaussian system. In conclusion, the main breakthroughs of MEC-based performance assessment methods are the improvement of the identification algorithm and the selection of new entropy indicators.
Correntropy is a measure of the generalized similarity of two random variables that uses information from the higher-order statistical properties of signals [15]. The kernel function of the mixture correntropy CPA [13] is the Gaussian kernel by default. However, the Gaussian function can only reflect a specific type of noise, and the shape of the kernel cannot be changed freely, so the Gaussian kernel is not always the best choice [16]. A more general correntropy, generalized correntropy (GC), has been proposed, in which the generalized Gaussian density function replaces the traditional Gaussian kernel for correlation analysis. With greater flexibility, generalized correntropy can describe the statistical characteristics of non-Gaussian random variables better and more fully. In recent years, generalized correntropy has been widely used to design robust adaptive filters for different applications [17,18,19,20]. For instance, to improve the accuracy and stability of traditional multi-kernel filtering algorithms, the multi-kernel generalized maximum correntropy (MKGMC) filtering algorithm was developed by Sun [17] under the generalized maximum correntropy criterion. Wang [18] applied generalized correntropy to the design of sparse Gaussian Hermite orthogonal filters. Qiu [19] replaced the mean square error loss with the generalized correntropy loss (GCL) in the original unscented Kalman filter (UKF) framework, combining the strength of GCL, developed in robust information theoretic learning to handle non-Gaussian interference, with the strength of the UKF in handling strong model nonlinearities. Therefore, it is very promising to study the performance assessment of non-Gaussian stochastic systems by replacing the entropy criterion with generalized correntropy.
To estimate the performance benchmark better, this paper integrates the idea of the wading across the stream algorithm (WSA) into the traditional EDA to improve the local convergence ability of the algorithm. In the iterative process, the center of some excellent individuals is crossed with the best solution to fully retain the information of the best individuals. Generalized correntropy is used as the fitness value of the H-EDA to identify the parameters of the controlled autoregressive moving average (CARMA) model of the controlled system, which accelerates the algorithm and improves its accuracy. The effectiveness of the hybrid algorithm is verified through the simulation of a single-loop feedback control system under different noise disturbances.
In the next section, the CPA index based on MEC is reviewed, and the generalized correntropy index is proposed to overcome the shortcomings of the previous entropy indexes. The third part introduces the performance evaluation process based on the H-EDA in detail. In the fourth section, a detailed simulation of a single-input single-output (SISO) system is carried out, and the conclusions are given in the last part.

2. Performance Assessment Based on MEC

2.1. Feedback Control Loop

A SISO feedback control system is considered in this paper for convenience of research; a detailed description is given in Figure 1, where r represents the system set value, u represents the output of the controller, and v represents stationary, independent, identically distributed unmeasurable noise.
The system in Figure 1 could be described as a CARMA model as follows,
A(z^{-1})\, y(k) = z^{-\tau} B(z^{-1})\, u(k) + C(z^{-1})\, v(k)    (1)
where y(k), u(k) and v(k) are the output, input and unknown noise disturbance of the CARMA model, respectively,
\begin{cases} A(z^{-1}) = 1 + a_1 z^{-1} + a_2 z^{-2} + \cdots + a_{n_a} z^{-n_a} \\ B(z^{-1}) = b_1 z^{-1} + b_2 z^{-2} + \cdots + b_{n_b} z^{-n_b} \\ C(z^{-1}) = 1 + c_1 z^{-1} + c_2 z^{-2} + \cdots + c_{n_c} z^{-n_c} \end{cases}    (2)
where n_a, n_b, n_c and τ are the structural parameters of the model; n_a, n_b, and n_c are the orders of A(z^{-1}), B(z^{-1}) and C(z^{-1}), respectively, and τ is the system delay. The parameters a_i, b_i, and c_i can be estimated by model identification, provided the structural parameters are known.
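As a concrete illustration, the CARMA recursion of Equation (1) can be simulated directly. The sketch below uses small illustrative coefficients (a₁ = −0.8, b₁ = 1, c₁ = 0.5, τ = 2) that are not taken from the paper.

```python
import numpy as np

def simulate_carma(a, b, c, tau, u, v):
    """Simulate A(z^-1) y(k) = z^-tau B(z^-1) u(k) + C(z^-1) v(k).

    a = [a1..a_na], b = [b1..b_nb], c = [c1..c_nc], following Equation (2);
    all coefficient values used below are illustrative, not from the paper.
    """
    n = len(u)
    y = np.zeros(n)
    for k in range(n):
        acc = v[k]                        # leading 1 of C(z^-1) acts on v(k)
        for i, ci in enumerate(c, 1):     # colored-noise terms c_i v(k-i)
            if k - i >= 0:
                acc += ci * v[k - i]
        for i, bi in enumerate(b, 1):     # delayed input terms b_i u(k-tau-i)
            if k - tau - i >= 0:
                acc += bi * u[k - tau - i]
        for i, ai in enumerate(a, 1):     # autoregressive terms -a_i y(k-i)
            if k - i >= 0:
                acc -= ai * y[k - i]
        y[k] = acc
    return y

rng = np.random.default_rng(0)
v = rng.standard_normal(500)
u = np.zeros(500)                         # zero input, as assumed in Section 2.2
y = simulate_carma(a=[-0.8], b=[1.0], c=[0.5], tau=2, u=u, v=v)
```

With zero input, the output is driven purely by the colored disturbance, which is exactly the situation analyzed in the next subsection.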

2.2. Minimum Entropy Index

Assuming that the input of the given system is 0, the output is,
y_t = \frac{G_v}{1 + G_p G_c} v_t = \frac{G_v}{1 + z^{-\tau} \tilde{G}_p G_c} v_t    (3)
where G̃_p is the delay-free part of the process transfer function. By solving the Diophantine equation, the disturbance transfer function G_v can be further decomposed,
G_v(z^{-1}) = F(z^{-1}) + z^{-\tau} R(z^{-1}) = \left(1 + n_1 z^{-1} + n_2 z^{-2} + \cdots + n_{\tau-1} z^{-(\tau-1)}\right) + z^{-\tau} R(z^{-1})    (4)
where F(z^{-1}) contains the impulse response coefficients of the disturbance transfer function G_v and R(z^{-1}) is the remaining transfer function satisfying the identity (4); substituting (4) into Equation (3) yields,
y(t) = F v(t) + L v(t-\tau) = \underbrace{\left(n_0 + n_1 z^{-1} + n_2 z^{-2} + \cdots + n_{\tau-1} z^{-(\tau-1)}\right) v(t)}_{\text{feedback-invariant}} + \underbrace{\left(n_{\tau} z^{-\tau} + n_{\tau+1} z^{-(\tau+1)} + \cdots\right) v(t)}_{\text{feedback-varying}}    (5)
where,
L = \frac{R - F \tilde{G}_p G_c}{1 + z^{-\tau} \tilde{G}_p G_c}    (6)
The control objective of the minimum entropy controller can be achieved by minimizing the entropy of the output variables when analyzing a linear non-Gaussian system [21,22]. The feedback-invariant term depends only on the characteristics of the disturbance, not on the process model or the controller. The second term on the right of Equation (5) is the feedback-varying term, which means the controller G_c can influence the entropy of the process output. Since the two terms on the right-hand side of Equation (5) are independent,
H(y_t) = H(F v_t + L v_{t-\tau}) \ge H(F v_t)    (7)
Just as in the MVC method, equality in Equation (7) holds only when L = 0; the output entropy then reaches its minimum, which is the benchmark entropy,
H_{\min}(y_t) = H(F v_t)    (8)
CPA based on MEC compares the actual output entropy H(y_t) with the output entropy H_min(y_t) under MEC. To sum up, the CPA index based on MEC can be expressed as,
\eta = \frac{H_{\min}(y_t)}{H(y_t)} = \frac{H(F v_t)}{H(y_t)}    (9)
Like the MVC index, this indicator always lies between 0 and 1. Generally, the closer it is to 1, the closer the system is to the ideal case, indicating better performance; conversely, the closer it is to 0, the worse the system performance, possibly including unstable control.

2.3. Prevenient Entropy Index and Generalized Correntropy

Entropy is a measure of the uncertainty of random variables, defined even for random variables without a mean or variance. The size of entropy is determined only by the shape of the distribution, not its location. There are many ways to describe entropy, such as Shannon entropy (SE), rational entropy (RE), Renyi entropy and delta entropy. In research based on the minimum entropy criterion, SE is widely used; for linear Gaussian systems, minimizing SE is equivalent to minimizing variance. For a continuous random variable x, Shannon entropy can be expressed as,
H_{SE} = -\int_{R} \gamma(x) \ln \gamma(x) \, dx, \quad x \in R    (10)
where γ(x) is its probability density function (PDF). SE has been used as a CPA criterion for MEC-based systems in previous research [10]. However, it does not satisfy the "consistency" principle, that is, the requirement that the result be the same whenever the entropy can be calculated in several different ways. This shows that SE should not be used as a new evaluation index.
Fortunately, the rational entropy (RE) criterion is proposed by Zhou [10], which has most of the properties of Shannon entropy and satisfies the “consistency” principle. The definition of rational entropy is,
H_{RE} = -\int_{R} \gamma(x) \log \frac{\gamma(x)}{1 + \gamma(x)} \, dx, \quad x \in R    (11)
where R is the domain of the random variable x, and γ(x) is its PDF. RE is a suitable benchmark for the CPA of linear non-Gaussian systems: by comparing the output rational entropy under MEC with the actual output rational entropy, a CPA index is obtained.
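To make the two entropy indices concrete, the following sketch estimates H_SE and H_RE from data via a histogram PDF estimate; the bin count and sample size are arbitrary choices, not values from the paper. It also illustrates the translation invariance noted above.

```python
import numpy as np

def shannon_entropy(samples, bins=100):
    # Histogram estimate of the PDF gamma(x), then H_SE = -∫ γ ln γ dx (Eq. 10)
    p, edges = np.histogram(samples, bins=bins, density=True)
    dx = edges[1] - edges[0]
    mask = p > 0
    return -np.sum(p[mask] * np.log(p[mask])) * dx

def rational_entropy(samples, bins=100):
    # H_RE = -∫ γ log(γ / (1 + γ)) dx (Eq. 11, Zhou's rational entropy)
    p, edges = np.histogram(samples, bins=bins, density=True)
    dx = edges[1] - edges[0]
    mask = p > 0
    return -np.sum(p[mask] * np.log(p[mask] / (1.0 + p[mask]))) * dx

rng = np.random.default_rng(1)
x = rng.standard_normal(20000)
h_se = shannon_entropy(x)   # theory for N(0,1): 0.5*ln(2*pi*e) ≈ 1.4189
h_re = rational_entropy(x)
```

Shifting the samples leaves both estimates essentially unchanged, which is precisely the translation invariance that motivates the correntropy-based index below.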
Of course, RE is not perfect. The PDF of the variable x is essential to calculate the output RE. If the PDF is known, it is convenient to use the RE criterion to calculate the CPA index. However, when the PDF of the variable is unknown, this method is very complicated in the actual calculation process. For instance, when the system parameters are estimated by the EDA, RE, as the fitness value, needs to be calculated several times in each iteration (usually thousands of times), and this operation is repeated many times during identification. In actual systems the parameters and data are more complex still, so this method wastes a great deal of time and lacks practical engineering significance.
The mixture correntropy index was adopted by Zhang [13] to solve the problem of translation invariance. However, the Gaussian kernel is applied as the kernel function of mixture correntropy in [13], which is not comprehensive in describing the statistical properties of non-Gaussian random variables [16]. As mentioned before, generalized correntropy is a more appropriate indicator.
Correntropy is a measure of local similarity, which is directly related to the similarity of two random variables in the joint space neighborhood controlled by kernel width. Given two random variables X and Y , their correntropy is defined as [16],
V_{\sigma}(X, Y) = E[\kappa_{\sigma}(X, Y)] = \int \kappa_{\sigma}(x, y) \, dF_{XY}(x, y)    (12)
where E[·] denotes the expectation, κ_σ is the kernel function with width σ (σ > 0), and F_{XY}(x, y) denotes the joint distribution function of (X, Y). Generally, the Gaussian kernel is chosen as the kernel function of correntropy,
\kappa_{\sigma}(x, y) = G_{\sigma}(e) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\left(-\frac{e^2}{2\sigma^2}\right) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp(-\lambda e^2)    (13)
where e = x − y and λ = 1/(2σ²) is the kernel parameter. In this paper, the well-known generalized Gaussian density (GGD) function is used as the kernel function in correntropy [23],
G_{\alpha,\beta}(e) = \frac{\alpha}{2\beta\,\Gamma(1/\alpha)} \exp\left(-\left|\frac{e}{\beta}\right|^{\alpha}\right) = \gamma_{\alpha,\beta} \exp(-\lambda |e|^{\alpha})    (14)
where Γ(·) is the gamma function, α > 0 is the shape parameter, β > 0 is the bandwidth parameter, λ = 1/β^α is the kernel parameter and γ_{α,β} = α/(2βΓ(1/α)) is the normalization constant.
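A minimal sketch of the GGD kernel of Equation (14); the parameter values α = 2, β = 1 are illustrative (α = 2 recovers a Gaussian-shaped kernel, α = 1 a Laplacian one).

```python
import math

def ggd_kernel(e, alpha=2.0, beta=1.0):
    """Generalized Gaussian density kernel G_{alpha,beta}(e) of Equation (14)."""
    gamma_ab = alpha / (2.0 * beta * math.gamma(1.0 / alpha))  # normalization constant
    lam = 1.0 / beta ** alpha                                  # kernel parameter
    return gamma_ab * math.exp(-lam * abs(e) ** alpha)

# The kernel peaks at e = 0, where it equals gamma_{alpha,beta};
# for alpha = 2, beta = 1 this peak is 1/sqrt(pi).
g0 = ggd_kernel(0.0)
```

The shape parameter α is what gives GC its extra flexibility over the fixed Gaussian kernel of Equation (13).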
The GGD function is used as the kernel function in the correntropy,
V_{\alpha,\beta}(X, Y) = E[G_{\alpha,\beta}(e)] = E[G_{\alpha,\beta}(X - Y)]    (15)
To distinguish it from correntropy with a Gaussian kernel, we call this quantity generalized correntropy (GC). Clearly, it is an extension of correntropy.
If the joint distribution f(x, y) of X and Y is known, the generalized correntropy can be calculated by the following formula,
V_{\alpha,\beta}(X, Y) = \int_{-\infty}^{+\infty} \int_{-\infty}^{+\infty} G_{\alpha,\beta}(x - y) f(x, y) \, dx \, dy    (16)
In general, however, the joint distribution of X and Y is hard to obtain. As with ordinary correntropy, generalized correntropy is then computed by sample estimation,
\hat{V}_{N,\alpha,\beta}(X, Y) = \frac{1}{N} \sum_{i=1}^{N} G_{\alpha,\beta}(x(i) - y(i)) = \frac{1}{N} \sum_{i=1}^{N} G_{\alpha,\beta}(e(i))    (17)
Next, some properties of generalized correntropy are summarized as follows to explain the internal conditions for it to be used as a performance index:
Property 1: V_{α,β}(X, Y) is symmetric, that is, V_{α,β}(X, Y) = V_{α,β}(Y, X);
Property 2: V_{α,β}(X, Y) is positive and bounded: 0 < V_{α,β}(X, Y) ≤ G_{α,β}(0) = γ_{α,β}, and it reaches the maximum value if and only if X = Y;
Property 3: Generalized correntropy contains all the absolute higher-order moments of the error variable e = X − Y: V_{\alpha,\beta}(X, Y) = \gamma_{\alpha,\beta} \sum_{n=0}^{\infty} \frac{(-\lambda)^n}{n!} E[|X - Y|^{\alpha n}].
These characteristics show that generalized correntropy can be applied to CPA. When the expected output of the given system is known, the controlled output PDF should track the target PDF as closely as possible; that is, the expected output R_k and the actual output Y_k should be as consistent as possible, with e_k = R_k − Y_k approaching 0. Ideally, the output tracks the target PDF completely, the performance of the control system reaches the optimum, and the generalized correntropy reaches its maximum value γ_{α,β}. Therefore, the new CPA index based on generalized correntropy can be defined as,
\eta_{GC} = \frac{V_{\alpha,\beta}(R_k, Y_k)}{\gamma_{\alpha,\beta}} = \frac{E[G_{\alpha,\beta}(e_k)]}{\gamma_{\alpha,\beta}}    (18)
where E[G_{α,β}(e_k)] is the generalized correntropy of e_k = R_k − Y_k. Obviously, 0 ≤ η_GC ≤ 1; the closer the index is to 1, the better the control system performance; otherwise, the system performance is poor and needs to be improved.
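The sample estimator (17) and the index (18) can be sketched together as follows; the setpoint, noise levels and kernel parameters below are hypothetical, chosen only to show that tighter tracking yields an index closer to 1.

```python
import numpy as np
from math import gamma as gamma_fn

def ggd(e, alpha, beta):
    """Vectorized GGD kernel of Equation (14)."""
    gab = alpha / (2.0 * beta * gamma_fn(1.0 / alpha))
    return gab * np.exp(-(np.abs(e) / beta) ** alpha)

def gc_index(r, y, alpha=2.0, beta=1.0):
    """eta_GC = V_hat(R, Y) / gamma_{alpha,beta}, which lies in (0, 1]."""
    e = np.asarray(r) - np.asarray(y)
    v_hat = np.mean(ggd(e, alpha, beta))                 # sample estimate of GC
    gab = alpha / (2.0 * beta * gamma_fn(1.0 / alpha))   # maximum value gamma_{a,b}
    return v_hat / gab

rng = np.random.default_rng(2)
r = np.ones(1000)                                 # hypothetical setpoint
y_good = r + 0.05 * rng.standard_normal(1000)     # tight tracking
y_bad = r + 1.0 * rng.standard_normal(1000)       # poor tracking
```

Perfect tracking (Y = R) gives exactly 1, and larger tracking errors push the index toward 0, matching the interpretation given above.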
When the expected output of the system is unknown, it is essential to identify the associated systems. In this case, generalized correntropy could be combined with the EDA to generate a new identification algorithm.
In the data analysis of regression and classification, a measure called the correntropic loss (C-loss) is often used instead of correntropy [24]. The GCL between X and Y can be defined as,
J_{GC\text{-}loss}(X, Y) = G_{\alpha,\beta}(0) - V_{\alpha,\beta}(X, Y) = \gamma_{\alpha,\beta} - V_{\alpha,\beta}(X, Y)    (19)
Obviously, minimizing GCL corresponds to maximizing generalized correntropy.
Assuming that {(x_i, y_i)}_{i=1}^{N} is a sample drawn from p_{XY}, the estimated value of the GCL is,
J_{GC\text{-}loss}(X, Y) = G_{\alpha,\beta}(0) - \hat{V}_{N,\alpha,\beta}(X, Y) = \gamma_{\alpha,\beta} - \frac{1}{N} \sum_{i=1}^{N} G_{\alpha,\beta}(x_i - y_i) = \gamma_{\alpha,\beta} - \frac{1}{N} \sum_{i=1}^{N} G_{\alpha,\beta}(e_i)    (20)
GCL could be applied as the fitness value of the H-EDA to search for the optimal parameters and estimate the noise PDF of the given system; the specific algorithm will be given in Section 3.

3. Improvement and Application of EDA in CPA

In essence, system parameter identification is a high-dimensional parameter-space optimization problem, so the system parameters can be obtained by high-dimensional optimization methods. To obtain the noise PDF, we must know the order, the time delay and the parameters of the given system.
For the estimation of the delay, the simplest and most commonly used method is to analyze the cross-correlation between the input u_t and output y_t signals [25], which can be expressed as,
\hat{\tau}_d = \arg\max_{\tau} R_{y,u}(\tau), \quad R_{y,u}(\tau) = E[y(t)\, u(t - \tau)]    (21)
When τ = τ_d, R_{y,u}(τ) reaches its peak, and the corresponding lag is the estimated delay.
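A sketch of this delay estimate using the sample cross-correlation; the first-order plant and the true delay used to generate the test signals are hypothetical.

```python
import numpy as np

def estimate_delay(u, y, max_lag=20):
    """Pick the lag maximizing the sample cross-correlation R_{y,u}(tau)."""
    u = u - u.mean()
    y = y - y.mean()
    corrs = [np.mean(y[lag:] * u[:len(u) - lag]) for lag in range(1, max_lag + 1)]
    return 1 + int(np.argmax(np.abs(corrs)))

rng = np.random.default_rng(3)
u = rng.standard_normal(5000)
tau_true = 4
# hypothetical plant: y(k) = 0.6 y(k-1) + u(k - tau) + small noise
y = np.zeros(5000)
for k in range(5000):
    past = y[k - 1] if k >= 1 else 0.0
    inp = u[k - tau_true] if k >= tau_true else 0.0
    y[k] = 0.6 * past + inp + 0.05 * rng.standard_normal()

tau_hat = estimate_delay(u, y)
```

Because the input is white, the cross-correlation at lag τ is proportional to the plant's impulse response coefficient at that lag, so the peak falls at the true delay.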
The Akaike information criterion (AIC) [26] is applied to determine the order of the system in this paper. For the given CARMA model, the AIC criterion is as follows, where L is the data length, σ̂_e² is the residual variance and n_c is the order of the noise model,
AIC(n) = L \lg \hat{\sigma}_e^2 + 2(n_a + n_b + n_c)    (22)
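A minimal sketch of this order-selection criterion; the penalty weight of 2 per parameter is the usual AIC choice and is an assumption here, as is the example residual sequence.

```python
import numpy as np

def aic(residuals, na, nb, nc):
    # AIC(n) = L * ln(sigma_e^2) + 2 * (na + nb + nc); among candidate orders,
    # the one with the smallest AIC is selected.
    L = len(residuals)
    sigma2 = np.mean(np.asarray(residuals, dtype=float) ** 2)
    return L * np.log(sigma2) + 2 * (na + nb + nc)

res = np.full(100, 0.1)           # hypothetical residual sequence
score_small = aic(res, 1, 1, 1)   # fewer parameters
score_large = aic(res, 2, 2, 2)   # same fit, more parameters: penalized
```

With identical residuals, the higher-order model is penalized by exactly 2 per extra parameter, so the lower-order model is preferred.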

3.1. EDA and Its Improvement

The estimation of distribution algorithm (EDA) is a randomized heuristic search algorithm that builds a probabilistic model of the solution space; this model is updated iteratively based on the quality of the solutions it samples [27]. In recent years, the EDA has received great attention as an optimization method and has been successfully applied to scheduling, project planning, machine learning and identification problems [27,28,29,30]. Wang [28] proposed a hybrid algorithm named the estimation of particle swarm distribution algorithm (EPSDA), which combines PSO (the local search method) with the EDA (the global search method) to improve the efficiency of solving nonlinear bilevel programming problems (NBLP). To overcome the instability of model updating in the iterative process, a novel EDA based on the classic compact genetic algorithm (cGA) was proposed in [27] to optimize the benchmark function ONEMAX. Liang [29] developed a new variant of the Gaussian EDA (GEDA) to solve the problem of premature convergence in the traditional GEDA. In the process of CPA, the EDA is mainly devoted to estimating the parameters of the given system. Through appropriate improvements, the given system can be identified faster and more accurately, so as to obtain a more accurate benchmark.

3.2. Parameter Identification Based on Hybrid-EDA

As previous work shows, the traditional EDA still has some shortcomings in the optimization process, such as a large amount of calculation and poor accuracy. Therefore, based on the traditional EDA, the parameter identification space is first obtained through preliminary estimation. To improve the local convergence ability, the idea of the WSA [31] is incorporated into the EDA: in the iterative process, the centers of some excellent individuals are selected to cross with the best individuals, so the information of outstanding individuals is retained more fully.

3.2.1. Acquisition of Parameter Identification Space

The initialization of parameter space is of positive significance to improve the identification accuracy and shorten the time required, so it is necessary to estimate the model parameters preliminarily. The recursive extended least squares (RELS) algorithm is selected in this paper.
The above CARMA model can be expressed as,
y(k) = \varphi^{T}(k)\, \theta + v(k)    (23)
where,
\begin{cases} \varphi(k) = [-y(k-1), \ldots, -y(k-n_a), u(k-\tau), \ldots, u(k-\tau-n_b), v(k-1), \ldots, v(k-n_c)]^{T} \\ \theta = [a_1, \ldots, a_{n_a}, b_0, \ldots, b_{n_b}, c_1, \ldots, c_{n_c}]^{T} \in R^{(n_a+n_b+1+n_c) \times 1} \end{cases}    (24)
Since v ( k ) is unmeasurable, it can only be replaced by its estimated value v ^ ( k ) ,
\hat{v}(k) = y(k) - \hat{y}(k) = y(k) - \hat{\varphi}^{T}(k)\, \hat{\theta}    (25)
where,
\begin{cases} \hat{\varphi}(k) = [-y(k-1), \ldots, -y(k-n_a), u(k-\tau), \ldots, u(k-\tau-n_b), \hat{v}(k-1), \ldots, \hat{v}(k-n_c)]^{T} \\ \hat{\theta} = [\hat{a}_1, \ldots, \hat{a}_{n_a}, \hat{b}_0, \ldots, \hat{b}_{n_b}, \hat{c}_1, \ldots, \hat{c}_{n_c}]^{T} \in R^{(n_a+n_b+1+n_c) \times 1} \end{cases}    (26)
The purpose is to obtain the parameter vector θ ^ when the objective function J ( θ ^ ) reaches the minimum value. J ( θ ^ ) is defined as follows,
J(\hat{\theta}) = \sum_{k=1}^{L} \hat{v}^{2}(k) = \sum_{k=1}^{L} \left[y(k) - \varphi^{T}(k)\, \hat{\theta}\right]^{2}    (27)
Based on this estimation of the Gaussian system parameters, initial estimates of the variance σ̂_v and the parameters θ̂ are obtained; the parameter identification space Ω_W is then taken as θ̂ ± 3σ̂_v.
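A sketch of RELS for this model, with the unmeasurable v(k) replaced by the running residual as in Equation (25); the initialization constants and the test system below are illustrative, not from the paper.

```python
import numpy as np

def rels(y, u, na, nb, nc, tau, lam=1.0):
    """Recursive extended least squares for the CARMA model (a sketch).

    theta = [a_1..a_na, b_0..b_nb, c_1..c_nc] as in Equation (24); the noise
    v(k) is replaced by the running residual v_hat(k), as in Equation (25).
    """
    n = na + (nb + 1) + nc
    theta = np.zeros(n)
    P = 1e4 * np.eye(n)               # large initial covariance (illustrative)
    vhat = np.zeros(len(y))
    for k in range(len(y)):
        phi = np.concatenate([
            [-y[k - i] if k - i >= 0 else 0.0 for i in range(1, na + 1)],
            [u[k - tau - i] if k - tau - i >= 0 else 0.0 for i in range(0, nb + 1)],
            [vhat[k - i] if k - i >= 0 else 0.0 for i in range(1, nc + 1)],
        ])
        K = P @ phi / (lam + phi @ P @ phi)       # gain vector
        theta = theta + K * (y[k] - phi @ theta)  # parameter update
        P = (P - np.outer(K, phi @ P)) / lam      # covariance update
        vhat[k] = y[k] - phi @ theta              # residual replaces v(k)
    return theta, vhat

# hypothetical test system: y(k) = 0.5 y(k-1) + u(k-1) + v(k),
# i.e. a1 = -0.5, b0 = 1.0, tau = 1
rng = np.random.default_rng(4)
u = rng.standard_normal(3000)
v = 0.1 * rng.standard_normal(3000)
y = np.zeros(3000)
for k in range(3000):
    y[k] = 0.5 * (y[k - 1] if k >= 1 else 0.0) + (u[k - 1] if k >= 1 else 0.0) + v[k]

theta_hat, _ = rels(y, u, na=1, nb=0, nc=0, tau=1)
```

The recovered θ̂ then seeds the identification space Ω_W = θ̂ ± 3σ̂_v for the H-EDA.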

3.2.2. The Idea of Wading Across Stream Algorithm

The wading across stream algorithm (WSA) [31] is inspired by the idea of "crossing the river by feeling the stones". The algorithm first examines the "shore" carefully to select an initial starting point; several solutions are then randomly sampled in the neighborhood of the starting point, and the best one is taken as the iteration result. This result becomes the new starting point, several solutions are again sampled in its neighborhood, the optimal one is selected as the next iteration result, and so on until the termination condition is satisfied. The WSA is similar to the simulated annealing algorithm but performs better, and its idea is simple with good local convergence ability, so we integrate it into the EDA to improve computational efficiency.
(1)
Choose the initial starting point carefully
When the parameter identification space is known, the initially selected solution has a great impact on the entire algorithm. First, generate R solutions in the parameter space with uniform random values and select the optimal one as the initial point. For the parameter vector θ̂ = [â_1, …, â_{n_a}, b̂_0, …, b̂_{n_b}, ĉ_1, …, ĉ_{n_c}] to be identified, values are taken uniformly at random in the interval [θ̂ − 3σ̂_v, θ̂ + 3σ̂_v] to generate R solutions (vectors) forming a parameter space A,
A = [\hat{\theta}_1; \hat{\theta}_2; \ldots; \hat{\theta}_R] = \begin{bmatrix} \hat{a}_{11} & \cdots & \hat{a}_{1n_a} & \hat{b}_{10} & \cdots & \hat{b}_{1n_b} & \hat{c}_{11} & \cdots & \hat{c}_{1n_c} \\ \hat{a}_{21} & \cdots & \hat{a}_{2n_a} & \hat{b}_{20} & \cdots & \hat{b}_{2n_b} & \hat{c}_{21} & \cdots & \hat{c}_{2n_c} \\ \vdots & & & & \vdots & & & & \vdots \\ \hat{a}_{R1} & \cdots & \hat{a}_{Rn_a} & \hat{b}_{R0} & \cdots & \hat{b}_{Rn_b} & \hat{c}_{R1} & \cdots & \hat{c}_{Rn_c} \end{bmatrix}    (28)
Calculate the fitness value of these R solutions and select the optimal individual θ̂* as the initial starting point according to the fitness value. For ease of description, denote the starting point as θ̂* = [x_1*, x_2*, …, x_n*], where n = n_a + n_b + n_c + 1.
(2)
Search strategy
Due to the complexity of parameter identification in high-dimensional space, the design of an effective local search strategy can significantly improve the quality of the solution. According to the idea of "crossing the river by feeling the stones", when you touch a "stone", you take it as the starting point to explore other "stones" around it. Similarly, m individuals are sampled within the neighborhood radius L_k around the starting point θ̂* to form a new parameter space B,
B = [\hat{\theta}_1; \hat{\theta}_2; \ldots; \hat{\theta}_m] = \begin{bmatrix} \hat{a}_{11} & \cdots & \hat{a}_{1n_a} & \hat{b}_{10} & \cdots & \hat{b}_{1n_b} & \hat{c}_{11} & \cdots & \hat{c}_{1n_c} \\ \hat{a}_{21} & \cdots & \hat{a}_{2n_a} & \hat{b}_{20} & \cdots & \hat{b}_{2n_b} & \hat{c}_{21} & \cdots & \hat{c}_{2n_c} \\ \vdots & & & & \vdots & & & & \vdots \\ \hat{a}_{m1} & \cdots & \hat{a}_{mn_a} & \hat{b}_{m0} & \cdots & \hat{b}_{mn_b} & \hat{c}_{m1} & \cdots & \hat{c}_{mn_c} \end{bmatrix} = \begin{bmatrix} x_{11} & x_{12} & \cdots & x_{1n} \\ x_{21} & x_{22} & \cdots & x_{2n} \\ \vdots & & & \vdots \\ x_{m1} & x_{m2} & \cdots & x_{mn} \end{bmatrix}    (29)
where x_{ij} = x_j* + L_k r_{ij}, and r_{ij} represents uniformly distributed random numbers in the interval [−1, 1]. Generally, the initial neighborhood radius L_0 is taken as roughly one tenth of the whole parameter space, that is, L_0 = (0.1–0.3)σ̂_v. To accelerate the later convergence of the algorithm, the neighborhood radius is reduced gradually during iteration. The simplest method is to set L_k = αL_{k−1}, where α is the shrinkage coefficient, generally in [0.90, 0.99]. The latest optimal individual θ̂* is obtained by calculating the fitness values of these m solutions.
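The neighborhood search with shrinking radius can be sketched as follows; the fitness here is a toy quadratic standing in for the GCL fitness of Section 3.2.4, and all population sizes and parameter values are illustrative.

```python
import numpy as np

def wsa_search(fitness, theta0, L0, alpha=0.95, m=50, n_iter=100, seed=0):
    """Neighborhood search with shrinking radius L_k = alpha * L_{k-1}.

    `fitness` is minimized; each iteration samples m candidates
    x_ij = x_j* + L_k * r_ij with r_ij uniform in [-1, 1].
    """
    rng = np.random.default_rng(seed)
    best = np.asarray(theta0, dtype=float)
    best_fit = fitness(best)
    L = L0
    for _ in range(n_iter):
        cand = best + L * rng.uniform(-1.0, 1.0, size=(m, best.size))
        fits = np.apply_along_axis(fitness, 1, cand)
        i = int(np.argmin(fits))
        if fits[i] < best_fit:            # keep the best "stone" found so far
            best, best_fit = cand[i], fits[i]
        L *= alpha                        # shrink the neighborhood radius
    return best, best_fit

# toy quadratic objective standing in for the GCL; optimum at [1, -2]
f = lambda th: np.sum((th - np.array([1.0, -2.0])) ** 2)
theta_star, f_star = wsa_search(f, theta0=[0.0, 0.0], L0=1.0)
```

The shrinking radius gives coarse global moves early on and fine local refinement later, which is exactly the local-convergence behavior the H-EDA borrows from the WSA.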

3.2.3. Crossover Operation

The optimal individual obtained by the traditional identification algorithm contains the information of only one individual, ignoring to some extent the information contained in other excellent individuals. The improvement is a crossover operation: sort the above m individuals according to their fitness values and select the top N* (N* < m) excellent solutions (the individuals with the best fitness values) to form D,
D = [\hat{\theta}_1; \hat{\theta}_2; \ldots; \hat{\theta}_{N^*}] = \begin{bmatrix} \hat{a}_{11} & \cdots & \hat{a}_{1n_a} & \hat{b}_{10} & \cdots & \hat{b}_{1n_b} & \hat{c}_{11} & \cdots & \hat{c}_{1n_c} \\ \hat{a}_{21} & \cdots & \hat{a}_{2n_a} & \hat{b}_{20} & \cdots & \hat{b}_{2n_b} & \hat{c}_{21} & \cdots & \hat{c}_{2n_c} \\ \vdots & & & & \vdots & & & & \vdots \\ \hat{a}_{N^*1} & \cdots & \hat{a}_{N^*n_a} & \hat{b}_{N^*0} & \cdots & \hat{b}_{N^*n_b} & \hat{c}_{N^*1} & \cdots & \hat{c}_{N^*n_c} \end{bmatrix} = \begin{bmatrix} x_{11} & x_{12} & \cdots & x_{1n} \\ x_{21} & x_{22} & \cdots & x_{2n} \\ \vdots & & & \vdots \\ x_{N^*1} & x_{N^*2} & \cdots & x_{N^*n} \end{bmatrix}    (30)
Calculate the center point of D,
\begin{cases} \bar{\theta} = [\bar{x}_1, \bar{x}_2, \ldots, \bar{x}_n]^{T} \\ \bar{x}_i = \frac{1}{N^*} \sum_{j=1}^{N^*} x_{ji} \end{cases}    (31)
Then, the center point is crossed with the best solution, and the optimal individual θ ^ * is modified as follows,
\hat{\theta}^* = a\, \hat{\theta}^* + (1 - a)\, \bar{\theta}    (32)
where a is a random number between 0 and 1.
Through the crossover operation, the information of excellent individuals could be retained to the maximum extent, which can avoid falling into the local optimum.
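A minimal sketch of the crossover step of Equation (32); the individuals and their values are hypothetical.

```python
import numpy as np

def crossover_with_center(best, elites, rng):
    """Blend the best individual with the center of the top-N* elites.

    Implements theta* <- a * theta* + (1 - a) * theta_bar with a ~ U(0, 1),
    so the result always lies on the segment between the two points.
    """
    theta_bar = np.mean(elites, axis=0)     # center point theta_bar of D, Eq. (31)
    a = rng.uniform(0.0, 1.0)
    return a * np.asarray(best) + (1.0 - a) * theta_bar

rng = np.random.default_rng(5)
best = np.array([1.0, 2.0])
elites = np.array([[0.9, 2.1], [1.1, 1.9], [1.0, 2.2]])  # hypothetical top-N* set
new_best = crossover_with_center(best, elites, rng)
```

Because the result is a convex combination, the modified individual never leaves the region spanned by the best solution and the elite center, which is how the operation retains elite information without wild jumps.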

3.2.4. Selection of Fitness Value

From Equation (23), the error of the CARMA model in this paper at time k can be defined as,
e(k) = y(k) - \hat{y}(k) = y(k) - \hat{\varphi}^{T}(k)\, \hat{\theta}    (33)
For the estimated error sequence e = [e_1, e_2, \ldots, e_L], the GCL can be defined as,
J_{GC\text{-}loss}(e) = \gamma_{\alpha,\beta} - \frac{1}{L} \sum_{i=1}^{L} G_{\alpha,\beta}(e_i)    (34)
The optimal model parameters and the relationship between output and disturbance could be obtained by minimizing GCL. When the GCL reaches the minimum, the corresponding parameters are also optimal, namely,
\hat{\theta}_{opt} = \arg\min_{\theta \in \Omega_W} J_{GC\text{-}loss}(e)    (35)
where: Ω W is the parameter identification space.

3.2.5. Algorithm Summary

The following Algorithm 1 is the program steps of the H-EDA identification algorithm.
Algorithm 1 Program steps for the identification algorithm.
1. Describe the system by CARMA model; estimate the system delay τ by analyzing the correlation between u(t) and y(t); determine the model order ( n a , n b ,   n c ).
2. Preliminary estimation. Rough model parameters are obtained by RELS; determine the identification space Ω_W as θ̂ ± 3σ̂_v.
3. Choose the initial starting point carefully. Randomly generate R individuals from the parameter space Ω_W, A = [θ̂_1; θ̂_2; …; θ̂_R]; calculate the fitness value (error entropy) of these solutions in A and select the optimal one θ̂_0* according to the fitness value (the one whose error entropy is minimum).
4. Search for m individuals B 0 = [ θ ^ 1 ;   θ ^ 2 ; ;   θ ^ m ] within the neighborhood radius L 0 of θ ^ 0 * ;   I = 1 ;
  While I <= nmax
(1) Calculate the fitness values (error entropy) of the m individuals and sort them as C_I = [θ̂_{1(I)}*; θ̂_{2(I)}*; …; θ̂_{m(I)}*] based on the fitness value. Extract the optimal one θ̂*_I and the top N* individuals D_I = [θ̂_{1(I)}*; θ̂_{2(I)}*; …; θ̂_{N*(I)}*] from C_I;
(2) Modify the optimal individual θ̂*_I. Calculate the center point (mean value) θ̄ = [x̄_1, x̄_2, …, x̄_n]^T of D_I and modify the optimal individual as θ̂*_I = a θ̂*_I + (1 − a) θ̄, a = rand;
(3) If the termination condition is met (the difference between the error entropies of two adjacent iterations is less than 0.0001), end the cycle;
     Else
L I = α L I 1 ;
 Search for m individuals B I = [ θ ^ 1 ( I ) ;   θ ^ 2 ( I ) ; ;   θ ^ m ( I ) ] within the neighborhood radius L I of θ ^ * I ;
I = I + 1 ;
     End
  End
5. Get the estimation of parameters θ ^ and noise PDF v ^ .
After the coefficients F̂ of the feedback-invariant term and the noise v̂_t are obtained, the benchmark entropy and output entropy can be computed by Equations (8) and (11), and the CPA index follows from Equation (9),
\hat{\eta} = \frac{\hat{H}_{\min}(y_t)}{H(y_t)} = \frac{H(\hat{F}\, \hat{v}_t)}{H(y_t)}    (36)

3.3. Algorithm Validation and Sensitivity Analysis of Initial Parameters

In this section, a model example is given to illustrate the initial parameters of the improved algorithm and to compare the H-EDA with the traditional EDA. Consider the following system,
y ( k ) = 1 + 1.5 z 1 + 0.9 z 2 1 1.7 z 1 + 0.7 z 2 v ( k )
For the above example, the parameter vector to be identified is θ = [ 1 . 7 ,   0 . 7 ,   1 . 5 ,   0 . 9 ] . Most of the initial parameters are based on the previous works [12,13,14,31,32,33]. Generally, in the EDA, the number of individuals in the initial population is N = 1000 ; the optimal number of individuals for each iteration is m = 200 ; and the maximum number of iterations is n m a x = 120 . In WSA, the number of excellent individuals for crossover operation is N * = 30 , R = 800 .
In this section, we focus on the sensitivity of the shrinkage coefficient α and the initial value of the neighborhood radius L_0. As mentioned before, the initial neighborhood radius is generally L_0 = (0.1~0.3)σ̂_v, and the shrinkage coefficient α lies between 0.90 and 0.99. To find a proper set of parameters, we carried out a large number of experiments; some representative results are listed in Table 1. In addition, we analyze the significance of a to the algorithm: a is a random number in [0, 1], and if a == 1, the crossover operation is not carried out. A parameter vector is defined as P = [L_0, α, a].
The test results show that the H-EDA is clearly superior to the traditional EDA. The global search capability of the algorithm is mainly determined by the neighborhood radius L_k: too small an L_k traps the algorithm in local optima, while too large an L_k slows it down. The coefficient α accelerates the later convergence of the algorithm, but too small an α causes premature convergence and poor accuracy; the main function of a is to reduce iterations and prevent local optimality. In short, the setting of the initial parameters must rely on extensive experiments and the summary of previous works. For this test case, the optimal initial parameter vector is P = [0.15σ̂_v, 0.95, rand].
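A quick back-of-envelope check shows why α = 0.95 fits the iteration budget n_max = 120: the radius L_k = α^k L_0 reaches a given resolution after log(L_min/L_0)/log α shrink steps. The resolution value 1e-3 below is an illustrative assumption.

```python
import math

def iterations_until(L0, alpha, L_min):
    """Shrink steps k before the radius alpha**k * L0 first falls below L_min."""
    return math.ceil(math.log(L_min / L0) / math.log(alpha))

# For L0 = 0.15 and a target resolution of 1e-3:
#   alpha = 0.90 exhausts the search after 48 shrink steps,
#   alpha = 0.95 after 98 (just inside n_max = 120),
#   alpha = 0.99 would need 499, far beyond the iteration budget.
```

This matches the qualitative observation above: a small α converges fast but stops refining early, while a large α never reaches fine resolution within n_max iterations.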

4. Simulation and Verification

To verify the effectiveness of the identification algorithm and the rationality of the performance assessment index, the following system is considered and the numerical simulation is carried out with Gaussian and non-Gaussian noise signals.
y ( t ) = u ( t 2 ) + 1 0.2 z 1 1 z 1 v ( t )
The transfer function of the controller is as follows,
G c = K 1 0.2 z 1 0.8 z 2
It is easy to verify that the parameter vector of the system is θ = [−1, 1, −1, −0.2] (cf. the actual values in Table 2), the system delay is τ = 2, the feedback invariant is F = [1, 0.8], and the controller gain in the simulation is K = 1.2.
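Under the assumption of a zero setpoint and unit-variance Gaussian v(t), this loop can be simulated directly; the recursions below simply restate the plant, the integrated disturbance, and the controller transfer function, and the function name is illustrative.

```python
import numpy as np

def simulate_loop(K=1.2, n=50000, seed=0):
    """Regulate y(t) = u(t-2) + (1 - 0.2 z^-1)/(1 - z^-1) v(t) around a zero
    setpoint with u(t) = 0.2 u(t-1) + 0.8 u(t-2) - K y(t), i.e. Gc acting on -y."""
    rng = np.random.default_rng(seed)
    v = rng.normal(0.0, 1.0, n)
    y, u, d = np.zeros(n), np.zeros(n), np.zeros(n)
    for t in range(2, n):
        d[t] = d[t - 1] + v[t] - 0.2 * v[t - 1]   # disturbance through (1-0.2z^-1)/(1-z^-1)
        y[t] = u[t - 2] + d[t]                    # plant with delay tau = 2
        u[t] = 0.2 * u[t - 1] + 0.8 * u[t - 2] - K * y[t]
    return y, v

y, v = simulate_loop()
```

Working through the closed-loop transfer function for K = 1.2, the first two impulse-response coefficients of the output come out as 1 and 0.8, i.e. exactly the feedback invariant F = [1, 0.8]; the output variance of about 2.0 then sits above the minimum-variance benchmark var(v_t + 0.8 v_{t−1}) = 1.64, which is the gap the CPA index measures.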
For clarity, the follow-up content is divided into the following parts: Section 4.1 explains the effects of the improved algorithm and of the improved benchmark entropy on the performance evaluation when the expected output is unknown, and verifies their effectiveness; Section 4.2 uses the tracking index to evaluate the system performance when the expected output of the system is known.

4.1. Simulation When Expected Output Distribution Is Unknown

It follows from Section 3 that when the expected output is unknown, the system parameter identification and noise PDF estimation should be carried out first; the benchmark value is then obtained from the feedback-invariant coefficient estimate F̂ and the noise estimate v̂, and the performance evaluation index is computed last. The specific process is shown in Figure 2.
According to the analysis ideas in Section 3.3, the parameters of the algorithms in this section are set as follows:
In the EDA, the number of individuals in the initial population is N = 1000, the optimal number of individuals for each iteration is m = 200, and the maximum number of iterations is n_max = 120. Correspondingly, the parameters of the H-EDA are set as N* = 30, L_0 = 0.15σ̂_v, α = 0.95, R = 800; the rest are consistent with the EDA. For the comparison algorithm, the improved PSO (IPSO), the number of particles is N = 50, the acceleration factors are c_1 = c_2 = 2, and V_max = 0.5.
The termination condition is |J(l) − J(l−1)| ≤ 0.0001 or reaching the specified number of iterations n_max, where J(l) is the l-th generation GCL.
The noise distributions are normal N(0, 0.255), Beta B(2, 9), exponential E(0.5), and the bimodal mixture,
f(x) = r / (√(2π) σ_1) e^{−(x − μ_1)² / (2σ_1²)} + (1 − r) / (√(2π) σ_2) e^{−(x − μ_2)² / (2σ_2²)}
where μ_1 = −3, μ_2 = 3, σ_1 = 1, σ_2 = 0.4, r = 0.7.
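The four disturbances can be sampled as below. Three readings are assumptions on our part: the second argument of N(0, 0.255) is taken as the variance, E(0.5) is taken as rate 0.5 (mean 2), and μ_1 is taken as −3 (a sign apparently lost in typesetting) so that the mixture is actually bimodal.

```python
import numpy as np

def noise_samples(kind, n, rng):
    """Draw n disturbance samples from the four test distributions."""
    if kind == "normal":        # N(0, 0.255), second argument read as the variance
        return rng.normal(0.0, np.sqrt(0.255), n)
    if kind == "beta":          # B(2, 9), mean 2/11
        return rng.beta(2.0, 9.0, n)
    if kind == "exponential":   # E(0.5), read as rate 0.5, i.e. mean 2
        return rng.exponential(2.0, n)
    if kind == "bimodal":       # two-component Gaussian mixture, weights r and 1-r
        mu1, mu2, s1, s2, r = -3.0, 3.0, 1.0, 0.4, 0.7
        pick = rng.random(n) < r
        return np.where(pick, rng.normal(mu1, s1, n), rng.normal(mu2, s2, n))
    raise ValueError(kind)

rng = np.random.default_rng(0)
x = noise_samples("bimodal", 100_000, rng)   # mixture mean: 0.7*(-3) + 0.3*3 = -1.2
b = noise_samples("beta", 100_000, rng)
```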
(1) When the fitness value is fixed as the GCL, compare the H-EDA with the traditional EDA and the IPSO.
The three algorithms are each combined with the GCL. The probability distributions of the four kinds of noise are shown in Figure 3; all of them except the normal distribution are non-Gaussian. The results of the system parameter identification are shown in Table 2.
The results in Table 2 show that, with the fitness value fixed, the H-EDA estimates the parameters of the CARMA model under different noises with better accuracy and speed. Moreover, since the fitness value is the generalized correntropy loss function, this also demonstrates the effectiveness of the novel criterion in the identification process. The noise distributions estimated by the H-EDA in Figure 4 and Figure 5 are very close to the real ones, whereas the IPSO is inferior to the H-EDA and cannot estimate the bimodal noise distribution. The superiority of the H-EDA is thus confirmed by the simulation results.
According to the noise PDF v̂_t and the feedback-invariant coefficient F̂, we can obtain the performance assessment indexes shown in Table 3, where η_ME is the theoretical benchmark index, η̂_MJ-EDA is the index obtained from the identification results of the traditional EDA based on the GCL, and η̂_MJ-H-EDA is the corresponding H-EDA index. Clearly, η̂_MJ-H-EDA is closer to the theoretical value, which again proves the effectiveness of the hybrid algorithm.
(2) When the identification algorithm is fixed as the H-EDA, compare the traditional correntropy (TC) with the generalized correntropy criterion (GCL).
The previous section established the effectiveness of the generalized correntropy criterion in the performance assessment process. Next, we compare the GCL with the TC, both based on the H-EDA. The algorithm parameters, identified objects, and noise PDFs are consistent with (1). The simulation results are shown in Table 4 and Figure 6.
The results show that whether the disturbance obeys a Gaussian, non-Gaussian, or bimodal distribution, the identification results based on the GCL are more accurate, and the evaluation index is obtained in less time.
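For reference, the two similarity measures being compared can be written down directly. The generalized Gaussian kernel below follows the form in Chen et al. [16]; the normalization and the loss form G(0) − V are our reading, with α = 2 (and β = √2σ) recovering the Gaussian kernel of traditional correntropy.

```python
import numpy as np
from math import gamma

def generalized_correntropy(e, alpha=2.0, beta=1.0):
    """Sample estimate of V_{alpha,beta} = E[G_{alpha,beta}(e)], where
    G_{alpha,beta}(e) = alpha / (2*beta*Gamma(1/alpha)) * exp(-|e/beta|**alpha)
    is the generalized Gaussian kernel."""
    e = np.asarray(e, dtype=float)
    norm = alpha / (2.0 * beta * gamma(1.0 / alpha))
    return float(np.mean(norm * np.exp(-np.abs(e / beta) ** alpha)))

def gcl(e, alpha=2.0, beta=1.0):
    """Generalized correntropy loss: G(0) - V(e); zero iff every error is zero."""
    norm = alpha / (2.0 * beta * gamma(1.0 / alpha))
    return norm - generalized_correntropy(e, alpha, beta)
```

The shape parameter α is the extra degree of freedom the TC lacks: α < 2 gives heavier-tailed kernels that are more tolerant of large non-Gaussian errors, which is what lets the GCL match the disturbance statistics more closely during identification.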

4.2. Simulation When the Expected Output Distribution Is Known

Assume that the known expected output distribution is Gaussian with mean 0 and variance 1; as described in Section 2, the output PDF is expected to track the target PDF as closely as possible.
The system is simulated under Gaussian and non-Gaussian noise disturbances when the expected output distribution is known. The output PDF tracks the target PDF as shown in Figure 7; the noise in (a) obeys a normal distribution, while the noise in (b) obeys a non-Gaussian distribution. The generalized correntropy performance assessment index is calculated by Formula (18) and shown in Table 5.
In Table 5, η_GC is the generalized correntropy index and η_C is the traditional correntropy index. The assessment results of the generalized correntropy index are consistent with those of the traditional correntropy index for different controller gains K, which also proves the effectiveness of the new index.

5. Conclusions

A new CPA method for linear non-Gaussian systems based on generalized correntropy has been proposed after analyzing the limitations of existing performance evaluation methods and indicators. When the expected output is known, a simple tracking index based on generalized correntropy is given directly. When the expected output is unknown, the generalized correntropy loss function is used as the fitness value of the H-EDA to estimate the system parameters and the noise distribution, so that the system performance benchmark, and hence the assessment result, can be calculated more quickly and accurately. The single-variable method is used to simulate a SISO control system under different noise disturbances, and the effectiveness of the new index and of the H-EDA is verified. The results show that the generalized correntropy index is superior to the traditional correntropy in evaluating the control loop performance of linear systems with non-Gaussian stochastic disturbances, and that the H-EDA identifies the system parameters more quickly and accurately. Future work will focus on the selection of the performance assessment index and further improvement of the identification algorithm.

Author Contributions

J.Z. and G.H. conceived and designed the experiments; G.H. performed the experiments; L.Z. and G.H. analyzed the data and wrote the paper. All authors have read and agreed to the published version of the manuscript.

Funding

This work is supported by the National Youth Science Foundation (Grant No. 61603136).

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Jelali, M. Control Performance Management in Chemical Automation; Springer: London, UK, 2013.
  2. Zhang, J.H.; Ren, M.F.; Tian, Y. Constrained stochastic distribution control for nonlinear stochastic systems with non-Gaussian noises. Int. J. Innov. Comput. Inf. Control 2013, 9, 1759–1768.
  3. Harris, T.J. Assessment of Closed Loop Performance. Can. J. Chem. Eng. 1989, 67, 856–861.
  4. Huang, B.; Shah, S.L. Performance Assessment of Nonminimum Phase Systems. In Performance Assessment of Control Loops; Springer: London, UK, 1999.
  5. Huang, B. Minimum variance control and performance assessment of time-variant processes. J. Process Control 2002, 12, 707–719.
  6. Chen, J.; Wang, W.Y. Performance assessment of multivariable control systems using PCA control charts. In Proceedings of the IEEE Conference on Industrial Electronics and Applications (ICIEA 2009), Xi'an, China, 25–27 May 2009.
  7. Guo, L.; Wang, H. Stochastic Distribution Control System Design; Springer: London, UK, 2010.
  8. Zhang, J.; Chu, C.C.; Munoz, J. Minimum entropy based run-to-run control for semiconductor processes with uncertain metrology delay. J. Process Control 2009, 19, 1688–1697.
  9. Zhang, J.; Jiang, M.; Chen, J. Minimum entropy-based performance assessment of feedback control loops subjected to non-Gaussian disturbances. J. Process Control 2014, 24, 1660–1670.
  10. Zhou, J.L.; Wang, X.; Zhang, J.F. A New Measure of Uncertainty and the Control Loop Performance Assessment for Output Stochastic Distribution Systems. IEEE Trans. Autom. Control 2015, 60, 2524–2529.
  11. Zhou, J.; Jia, Y.; Jiang, H.; Fan, S. Non-Gaussian Systems Control Performance Assessment Based on Rational Entropy. Entropy 2018, 20, 331.
  12. Zhang, Q.; Wang, Y. Performance Assessment of Cascade Control System with Non-Gaussian Disturbance Based on Minimum Entropy. Symmetry 2019, 11, 379.
  13. Zhang, J.; Wu, D. Performance Assessment of Non-Gaussian Control Systems Based on Mixture Correntropy. Entropy 2019, 21, 1069.
  14. Zhang, Q.; Wang, Y. Improved Renyi Entropy Benchmark for Performance Assessment of Common Cascade Control System. IEEE Access 2019, 7, 6796–6803.
  15. Chen, B.; Wang, X.; Lu, N. Mixture Correntropy for Robust Learning. Pattern Recognit. 2018, 79, 318–327.
  16. Chen, B.; Xing, L.; Zhao, H. Generalized Correntropy for Robust Adaptive Filtering. IEEE Trans. Signal Process. 2016, 64, 3376–3387.
  17. Sun, Q.; Zhang, T.; Huang, X. Robust Multi-kernel Generalized Maximum Correntropy Filters. In Proceedings of the 2020 IEEE 3rd International Conference on Electronic Information and Communication Technology (ICEICT), Shenzhen, China, 13–15 November 2020; pp. 99–103.
  18. Wang, W.; Wang, S. Generalized Correntropy Sparse Gauss-Hermite Quadrature Filter for Epidemic Tracking on Complex Networks. IEEE Trans. Syst. Man Cybern. Syst. 2021.
  19. Ma, W.; Qiu, J.; Liu, X. Unscented Kalman Filter with Generalized Correntropy Loss for Robust Power System Forecasting-Aided State Estimation. IEEE Trans. Ind. Inform. 2019, 15, 6091–6100.
  20. Ren, M.; Gong, M.; Lin, M. Generalized Correntropy Predictive Control for Waste Heat Recovery Systems Based on Organic Rankine Cycle. IEEE Access 2019, 7, 151587–151594.
  21. Chen, B.; Zhu, Y.; Hu, J. System Parameter Identification: Information Criteria and Algorithms; Newnes: Boston, MA, USA, 2013.
  22. Yu, J.; Zhou, J.; Li, D. Performance assessment of control loops with non-Gaussian disturbance based on generalized minimum entropy. In Proceedings of the International Conference on Industrial Artificial Intelligence (IAI), Shenyang, China, 23–27 July 2019.
  23. Chen, B.; Principe, J.C. Maximum correntropy estimation is a smoothed MAP estimation. IEEE Signal Process. Lett. 2012, 19, 491–494.
  24. Syed, M.N.; Pardalos, P.M.; Principe, J.C. On the optimization properties of the correntropic loss function in data analysis. Optim. Lett. 2014, 8, 823–839.
  25. Bjorklund, S. A Survey and Comparison of Time-Delay Estimation Methods in Linear Systems; Lund Institute of Technology: Lund, Sweden, 2003.
  26. Chan, Y.T.; Wood, J. A new order determination technique for ARMA processes. IEEE Trans. Acoust. Speech Signal Process. 1984, 32, 517–521.
  27. Doerr, B.; Krejca, M.S. Significance-Based Estimation-of-Distribution Algorithms. IEEE Trans. Evol. Comput. 2020, 24, 1025–1034.
  28. Wang, G.; Ma, L. The Estimation of Particle Swarm Distribution Algorithm with Sensitivity Analysis for Solving Nonlinear Bilevel Programming Problems. IEEE Access 2020, 8, 137133–137149.
  29. Liang, Y.; Ren, Z.; Yao, X. Enhancing Gaussian Estimation of Distribution Algorithm by Exploiting Evolution Direction with Archive. IEEE Trans. Cybern. 2020, 50, 140–152.
  30. Chi, Y.; Yao, Y.; Hu, L. Study of Via Optimization Problem Based on Estimation of Distribution Algorithm. In Proceedings of the International Symposium on Computational Intelligence and Design (ISCID), Hangzhou, China, 14–15 December 2019; pp. 84–88.
  31. Gao, S.; Qiu, L.; Cao, C.G. Improved Estimation of Distribution Algorithm for Multi-objective Optimization Problems. J. Nanjing Norm. Univ. (Nat. Sci. Ed.) 2015, 38, 108–112.
  32. Zhou, J.; Zhang, J.; Zhu, H. Non-Gaussian system identification based on improved estimation of distribution algorithm. In Proceedings of the 2017 36th Chinese Control Conference (CCC), Dalian, China, 26–28 July 2017; pp. 2052–2057.
  33. Zhang, H.; Zhou, J.; Wang, J. Performance Assessment of Non-Gaussian Systems Based on Double Error Entropy Minimization. In Proceedings of the 2019 IEEE 8th Data Driven Control and Learning Systems Conference (DDCLS), Dali, China, 24–27 May 2019; pp. 1177–1182.
Figure 1. SISO feedback control loop system.
Figure 2. Flow chart of performance assessment.
Figure 3. Probability distribution of different noises.
Figure 4. Actual and estimated probability distribution histograms of different noises.
Figure 5. Actual and estimated PDF of different noises.
Figure 6. Actual and estimated PDF with different noises.
Figure 7. Expected and actual output PDF under different noises.
Table 1. Model identification results based on the parameters sensitivity analysis (partial).
Project   Parameter configuration           Identification results              Iterations
EDA       N = 1000, m = 200                 [−1.4922, 0.5527, 1.6509, 0.6267]   120
H-EDA     L0 = 0.10σ̂v, α = 0.93, a = rand   [−1.7209, 0.7241, 1.4680, 0.8668]   75
H-EDA     L0 = 0.10σ̂v, α = 0.93, a == 1     [−1.7398, 0.7529, 1.4099, 0.7911]   105
H-EDA     L0 = 0.10σ̂v, α = 0.95, a = rand   [−1.7309, 0.7227, 1.4219, 0.8113]   82
H-EDA     L0 = 0.10σ̂v, α = 0.95, a == 1     [−1.6675, 0.6728, 1.4003, 0.7849]   109
H-EDA     L0 = 0.10σ̂v, α = 0.97, a = rand   [−1.7273, 0.7305, 1.4085, 0.8173]   91
H-EDA     L0 = 0.10σ̂v, α = 0.97, a == 1     [−1.6631, 0.6645, 1.4204, 0.8054]   120
H-EDA     L0 = 0.15σ̂v, α = 0.93, a = rand   [−1.7197, 0.7218, 1.5160, 0.8851]   76
H-EDA     L0 = 0.15σ̂v, α = 0.93, a == 1     [−1.7749, 0.7735, 1.4595, 0.8299]   87
H-EDA     L0 = 0.15σ̂v, α = 0.95, a = rand   [−1.6919, 0.6975, 1.5009, 0.9104]   62
H-EDA     L0 = 0.15σ̂v, α = 0.95, a == 1     [−1.7181, 0.7184, 1.5090, 0.8544]   79
H-EDA     L0 = 0.15σ̂v, α = 0.97, a = rand   [−1.7304, 0.7310, 1.5164, 0.8964]   95
H-EDA     L0 = 0.15σ̂v, α = 0.97, a == 1     [−1.6643, 0.6654, 1.4698, 0.8605]   113
H-EDA     L0 = 0.20σ̂v, α = 0.93, a = rand   [−1.7112, 0.7176, 1.3935, 0.8291]   71
H-EDA     L0 = 0.20σ̂v, α = 0.93, a == 1     [−1.7672, 0.7689, 1.1863, 0.5712]   97
H-EDA     L0 = 0.20σ̂v, α = 0.95, a = rand   [−1.6734, 0.6737, 1.5061, 0.9069]   104
H-EDA     L0 = 0.20σ̂v, α = 0.95, a == 1     [−1.7462, 0.7504, 1.4771, 0.8579]   117
H-EDA     L0 = 0.20σ̂v, α = 0.97, a = rand   [−1.6823, 0.7446, 1.4500, 0.8403]   109
H-EDA     L0 = 0.20σ̂v, α = 0.97, a == 1     [−1.7769, 0.7789, 1.4058, 0.8304]   120
Table 2. Identification results of two algorithms under different noise distributions.
Parameter   Actual value   N(0, 0.255): EDA   N(0, 0.255): H-EDA   B(2, 9): EDA   B(2, 9): H-EDA
a1          −1             −0.9342            −0.9873              −1.0529        −1.0027
b0          1              0.9186             0.9968               1.0848         1.0003
b1          −1             −0.9337            −0.9816              −0.9466        −0.9992
c1          −0.2           −0.0879            −0.1991              −0.2365        −0.2014
F           [1, 0.8]       [1, 0.7011]        [1, 0.7782]          [1, 0.8596]    [1, 0.8035]
Time (s)                   29.9517            15.2724              35.4492        13.6584

Parameter   Actual value   E(0.5): EDA   E(0.5): H-EDA   Bimodal: EDA   Bimodal: H-EDA
a1          −1             −0.9158       −0.9884         −0.9173        −1.0097
b0          1              0.9389        0.9961          0.9377         1.0643
b1          −1             −0.9323       −0.9924         −0.8946        −1.0853
c1          −0.2           −0.0865       −0.1935         −0.1556        −0.2037
F           [1, 0.8]       [1, 0.7595]   [1, 0.7857]     [1, 0.6987]    [1, 0.8138]
Time (s)                   30.9885       14.2616         29.6255        14.7962
Table 3. CPA indexes of two comparison algorithms.
Index         N(0, 0.255)   B(2, 9)   E(0.5)   Bimodal noise
η_ME          0.8809        0.9357    0.9408   0.9332
η̂_MJ-EDA      0.8345        0.9016    0.9134   0.9267
η̂_MJ-H-EDA    0.8794        0.9388    0.9399   0.9336
Table 4. Estimated parameters and evaluation indexes under two different entropy benchmarks.
TC = results of traditional correntropy; GC = results of generalized correntropy.
Parameter   Actual value   N(0, 0.255): TC   N(0, 0.255): GC   B(2, 9): TC   B(2, 9): GC
a1          −1             −0.9312           −0.9873           −1.0148       −1.0027
b0          1              1.0061            0.9968            1.0254        1.0003
b1          −1             −1.1073           −0.9816           −1.1995       −0.9992
c1          −0.2           −0.1635           −0.1991           −0.1837       −0.2014
F           [1, 0.8]       [1, 0.7149]       [1, 0.7782]       [1, 0.8434]   [1, 0.8035]
η̂                          0.8566            0.8794            0.8927        0.9388
Time (s)                   18.9443           15.2724           15.1667       13.6584

Parameter   Actual value   E(0.5): TC    E(0.5): GC    Bimodal: TC   Bimodal: GC
a1          −1             −0.9838       −0.9884       −0.9945       −1.0097
b0          1              0.9246        0.9961        1.0852        1.0643
b1          −1             −0.9697       −0.9924       −1.0521       −1.0853
c1          −0.2           −0.2194       −0.1935       −0.1798       −0.2037
F           [1, 0.8]       [1, 0.7520]   [1, 0.7857]   [1, 0.8102]   [1, 0.8138]
η̂                          0.9071        0.9399        0.9462        0.9336
Time (s)                   18.9936       14.2616       18.5615       14.7962
Table 5. Generalized correntropy performance index under expected output distribution is known.
Gain      Index   Gaussian noise   Non-Gaussian noise
K = 0.8   η_C     0.8406           0.6554
          η_GC    0.8215           0.6021
K = 1.0   η_C     0.8698           0.7077
          η_GC    0.8520           0.6641
K = 1.2   η_C     0.8874           0.7447
          η_GC    0.8162           0.6429

Share and Cite

Zhang, J.; Huang, G.; Zhang, L. Generalized Correntropy Criterion-Based Performance Assessment for Non-Gaussian Stochastic Systems. Entropy 2021, 23, 764. https://0-doi-org.brum.beds.ac.uk/10.3390/e23060764
