Article

Fractional Deng Entropy and Extropy and Some Applications

by Mohammad Reza Kazemi 1, Saeid Tahmasebi 2, Francesco Buono 3 and Maria Longobardi 4,*

1 Department of Statistics, Faculty of Science, Fasa University, Fasa 746-168-6688, Iran
2 Department of Statistics, Persian Gulf University, Bushehr 751-691-3817, Iran
3 Dipartimento di Matematica e Applicazioni “Renato Caccioppoli”, Università degli Studi di Napoli Federico II, 80138 Naples, Italy
4 Dipartimento di Biologia, Università degli Studi di Napoli Federico II, 80138 Naples, Italy
* Author to whom correspondence should be addressed.
Submission received: 29 April 2021 / Revised: 12 May 2021 / Accepted: 14 May 2021 / Published: 17 May 2021
(This article belongs to the Special Issue Measures of Information)

Abstract:
Deng entropy and extropy are two measures useful in the Dempster–Shafer evidence theory (DST) to study uncertainty, following the idea that extropy is the dual concept of entropy. In this paper, we present their fractional versions named fractional Deng entropy and extropy and compare them to other measures in the framework of DST. Here, we study the maximum for both of them and give several examples. Finally, we analyze a problem of classification in pattern recognition in order to highlight the importance of these new measures.

1. Introduction

The concept of entropy as a measure of uncertainty was first introduced by Shannon [1], and since then, it has been used in information theory, image and signal processing, and economics. Let X be a discrete random variable with probability mass function vector $\underline{p} = (p_1, \ldots, p_n)$. The Shannon entropy of X is defined as follows:
$$H(X) = H(\underline{p}) = -\sum_{i=1}^{n} p_i \log p_i,$$
where $\log(\cdot)$ stands for the natural logarithm, with the convention $0 \log 0 = 0$. Recently, the dual measure of entropy has become widespread. It is known as extropy and was defined for a discrete random variable X by Lad et al. [2] as
$$J(X) = J(\underline{p}) = -\sum_{i=1}^{n} (1 - p_i) \log(1 - p_i),$$
and since then, like the Shannon entropy, it has been studied in several contexts and in its differential version [3,4,5,6].
The generalization of Shannon entropy to various fields is always of great interest. Ubriaco [7] defined a new entropy based on fractional calculus as follows:
$$S_q(X) = S_q(\underline{p}) = \sum_{i=1}^{n} p_i \left[-\log p_i\right]^q, \quad 0 < q \leq 1.$$
The fractional entropy is concave, positive and non-additive. Moreover, for $q = 1$, the fractional entropy reduces to the Shannon entropy. In a physical sense, it also satisfies Lesche and thermodynamic stability.
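The three measures above are straightforward to compute; the following minimal Python sketch (an illustration of the definitions, with function names chosen here) evaluates them for a probability mass function vector, using the natural logarithm and the convention $0 \log 0 = 0$.

```python
import math

def shannon_entropy(p):
    # H(p) = -sum_i p_i log p_i, with the convention 0 log 0 = 0
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def extropy(p):
    # J(p) = -sum_i (1 - p_i) log(1 - p_i), the dual measure of Lad et al. [2]
    return -sum((1 - pi) * math.log(1 - pi) for pi in p if pi < 1)

def fractional_entropy(p, q):
    # S_q(p) = sum_i p_i [-log p_i]^q, 0 < q <= 1; q = 1 recovers Shannon entropy
    return sum(pi * (-math.log(pi)) ** q for pi in p if pi > 0)

p = [1/3, 1/3, 1/3]
print(shannon_entropy(p))          # log 3
print(extropy(p))                  # 2 log(3/2)
print(fractional_entropy(p, 0.5))  # [log 3]^0.5
```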
The purpose of this paper is to extend Deng entropy and extropy to the fractional case. Deng entropy and extropy [8,9] are two measures of uncertainty known in the context of the Dempster–Shafer theory (DST) of evidence. The DST of evidence [10,11] is a generalization of the classical probability theory. In DST, an uncertain event with a finite number of alternatives is considered, and a mass function over the power set of the alternatives, interpreted as a degree of confidence, is defined. DST allows us to describe more general situations than the classical probability theory, in which the available information is less specific. DST has several applications due to its advantages in dealing with uncertainty; for example, it is used in reliability analysis [12,13], in decision making [14,15], and so on [16,17].
Now, we describe an example given in [8] to explain how DST extends the classical probability theory. Consider two boxes, A and B, such that A contains only red balls and B contains only green balls, the number of balls in each box being unknown. A ball is picked randomly from one of the boxes: box A is chosen with probability $p_A = 0.6$ and box B with probability $p_B = 0.4$. Thus, the probability of picking a red ball is 0.6, $P(R) = 0.6$, and the probability of picking a green ball is 0.4, $P(G) = 0.4$. Now, suppose that box B contains both green and red balls in unknown proportions, while $p_A$ and $p_B$ are unchanged. In this case, we cannot obtain the probability of picking a red ball. To overcome this problem, we can use DST to express the uncertainty. In particular, we choose a mass function m such that $m(R) = 0.6$ and $m(R, G) = 0.4$.
The rest of the paper is organized as follows. In Section 2, we recall the basic notions of the Dempster–Shafer theory of evidence and some of the most important measures of uncertainty in this context. In Section 3, we define and study the fractional Deng entropy. In Section 4, we introduce the fractional Deng extropy, and several examples are given. In Section 5, we apply fractional Deng entropy and fractional Deng extropy to a problem of classification. Finally, in Section 6, we give conclusions and summarize the results obtained in the paper.

2. Preliminaries

In this section, we review some basic definitions in the Dempster–Shafer evidence theory (DST) [10,11] and Deng entropy [8].
Definition 1.
Let $X = \{\theta_1, \theta_2, \ldots, \theta_i, \ldots, \theta_{|X|}\}$ be a finite set of mutually exclusive and collectively exhaustive events; X is called the frame of discernment (FOD). The power set of X, denoted by $2^X$, consists of $2^{|X|}$ elements:
$$2^X = \{\emptyset, \{\theta_1\}, \ldots, \{\theta_{|X|}\}, \{\theta_1, \theta_2\}, \ldots, \{\theta_1, \theta_2, \ldots, \theta_i\}, \ldots, X\}.$$
Definition 2.
(Mass function) Given a FOD $X = \{\theta_1, \theta_2, \ldots, \theta_i, \ldots, \theta_{|X|}\}$, a mapping m from $2^X$ to $[0, 1]$ is called a mass function, or basic probability assignment (BPA), formally defined by
$$m: 2^X \to [0, 1],$$
which satisfies
$$m(\emptyset) = 0, \qquad \sum_{A \in 2^X} m(A) = 1, \qquad m(A) \geq 0.$$
In DST, m ( A ) represents how strongly the evidence supports A. Then, m ( A ) measures the belief exactly assigned to A. If m ( A ) > 0 , then A is called a focal element.
Recently, some operations on BPAs have been introduced, such as negation [18] and correlation [19]. In several applications, we need to generate a new BPA starting from independent BPAs or from a weight of evidence represented by a coefficient $\alpha \in (0, 1]$.
In DST, there are different indices to evaluate the degree of belief in a subset of FOD. Among them, here we recall the definitions of belief function, plausibility function and pignistic probability transformation (PPT).
Definition 3.
(Belief function and plausibility function) A BPA m can also be represented by the belief function Bel or the plausibility function Pl, defined as follows:
$$Bel(A) = \sum_{B \subseteq A} m(B), \qquad Pl(A) = \sum_{B \cap A \neq \emptyset} m(B).$$
Definition 4.
Given a BPA m on a FOD X, the pignistic probability transformation (PPT) of $A \subseteq X$ is defined as [20]
$$PPT(A) = \sum_{B: A \subseteq B} \frac{m(B)}{|B|}. \qquad (5)$$
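These indices are easy to compute once a BPA is encoded as a map from focal elements to masses. The following sketch uses this representation with the BPA of the two-box example from the Introduction, and reads Definition 4 as a sum over the supersets B of A:

```python
# A BPA as a dict: focal element (frozenset) -> mass
m = {frozenset({'R'}): 0.6, frozenset({'R', 'G'}): 0.4}  # two-box example

def bel(m, A):
    # Belief: total mass of the subsets of A
    return sum(v for B, v in m.items() if B <= A)

def pl(m, A):
    # Plausibility: total mass of the focal elements intersecting A
    return sum(v for B, v in m.items() if B & A)

def ppt(m, A):
    # Pignistic probability: mass of the supersets of A, split by cardinality
    return sum(v / len(B) for B, v in m.items() if A <= B)

A = frozenset({'R'})
print(bel(m, A), pl(m, A), ppt(m, A))  # 0.6 1.0 0.8
```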

Some Uncertainty Measures for the Dempster–Shafer Framework

In the context of the DST, there are interesting measures of discrimination, such as Deng entropy, which in some cases has advantages over other uncertainty measures in the DST framework. It is this latter concept that suggested to us the introduction of a new extension. In Table 1, we present the definitions of some of the most important measures of uncertainty in DST.
Definition 5.
(Deng entropy) Deng entropy was introduced in [8] for a BPA m as
$$E_d(m) = -\sum_{A \subseteq X: m(A) > 0} m(A) \log_2 \frac{m(A)}{2^{|A|} - 1}, \qquad (6)$$
where $|A|$ denotes the cardinality of the focal element A.
Deng entropy degenerates to the Shannon entropy if, and only if, positive mass is assigned only to singleton elements, in which case $E_d(m) = -\sum_{i=1}^{|X|} m(\{\theta_i\}) \log_2 m(\{\theta_i\})$. Deng entropy has attracted the interest of researchers, and several of its generalizations have been studied. In Table 2, we present some modified versions of Deng entropy.

3. Fractional Deng Entropy

In recent years, great attention has been given to fractional calculus. For this reason, several authors have studied various fractional entropies, motivated by the idea that they satisfy physical conditions of stability. In order to obtain an analog of (6), we introduce the concept of fractional Deng entropy in the following definition.
Definition 6.
Let m be a BPA on a FOD X. We define the Fractional Deng Entropy (FDEn) of m as
$$E_d^q(m) = \sum_{A \subseteq X: m(A) > 0} m(A) \left[-\log_2 \frac{m(A)}{2^{|A|} - 1}\right]^q, \quad 0 < q \leq 1. \qquad (7)$$
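Definition 6 translates directly into code. The following sketch reuses the dict encoding of a BPA from the sketch in Section 2 and checks the value $[\log_2 3]^q$ obtained in Example 1(i) below:

```python
import math

def fden(m, q):
    # Fractional Deng entropy of a BPA m (dict: frozenset -> mass), 0 < q <= 1;
    # q = 1 recovers Deng entropy
    return sum(mass * (-math.log2(mass / (2 ** len(A) - 1))) ** q
               for A, mass in m.items() if mass > 0)

m = {frozenset({x}): 1/3 for x in 'abc'}
print(fden(m, 0.5), math.log2(3) ** 0.5)  # equal: [log2 3]^q with q = 0.5
```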
Example 1.
(i) 
Assume that the FOD is $X = \{a, b, c\}$. For a mass function $m(a) = m(b) = m(c) = \frac{1}{3}$, the associated fractional entropy and FDEn are obtained as follows:
$$S_q(\underline{p}) = E_d^q(m) = [\log_2 3]^q.$$
It is obvious that, in this case, the FDEn is increasing in $q \in (0, 1]$.
(ii) 
Assume there is $a \in X$ such that $m(a) = 1$. The associated fractional entropy and FDEn coincide and are obtained as
$$S_q(\underline{p}) = E_d^q(m) = 0.$$
Clearly, the results of fractional entropy and FDEn are identical when the BPA assigns positive mass only to singletons. Moreover, if there exists $A \subseteq X$ such that $m(A) > 0$ and $|A| > 1$, we cannot evaluate the fractional entropy.
Example 2.
Given a FOD $X = \{a, b, c\}$, for a mass function $m_1(a, b, c) = 1$, we have
$$E_d^q(m_1) = [\log_2 7]^q.$$
For another mass function $m_2(a) = m_2(b) = m_2(c) = m_2(a, b) = m_2(a, c) = m_2(b, c) = m_2(a, b, c) = \frac{1}{7}$, we obtain
$$E_d^q(m_2) = \frac{3}{7} [\log_2 7]^q + \frac{3}{7} [\log_2 21]^q + \frac{1}{7} [\log_2 49]^q.$$
The plot of the FDEn as a function of $q \in (0, 1]$ is given in Figure 1. From Figure 1, it is seen that $E_d^q(m_2)$ is increasing in q and the maximum is achieved for $q = 1$, i.e., when the FDEn reduces to Deng entropy.
Example 3.
Assume that the FOD is $X = \{a_1, a_2, \ldots, a_{20}\}$. For a mass function $m(\{a_1, a_2, \ldots, a_{10}\}) = 0.4$, $m(\{a_{11}, a_{12}, \ldots, a_{20}\}) = 0.6$, we obtain
$$E_d^q(m) = 0.4 \left[-\log_2 \frac{0.4}{2^{10} - 1}\right]^q + 0.6 \left[-\log_2 \frac{0.6}{2^{10} - 1}\right]^q.$$
The plot of this FDEn is given in Figure 2. From Figure 2, it is seen that E d q ( m ) is increasing in q, and the maximum is achieved when FDEn reduces to Deng entropy.
Example 4.
Let us consider a FOD $X = \{a, b, c\}$ and a BPA m such that $m(a) = p_a$ and $m(a, b) = r_a$, where $r_a = 1 - p_a$. For $p_a \in \{0.01, 0.8, 0.99\}$, the function $E_d^q(m)$ is computed. This example shows that $E_d^q(m)$ can be increasing, decreasing or upside-down bathtub shaped. The FDEn is given by
$$E_d^q(m) = p_a \left[\log_2(1/p_a)\right]^q + r_a \left[\log_2(3/r_a)\right]^q.$$
In Figure 3, the plot of $E_d^q(m)$ for different values of $p_a$ is given. It is seen that for $p_a = 0.01$, $p_a = 0.80$ and $p_a = 0.99$, the plot of $E_d^q(m)$ is increasing, upside-down bathtub shaped and decreasing, respectively.
In the above examples, it is seen that the function $E_d^q(m)$ cannot be concave, and it can be increasing, decreasing or upside-down bathtub shaped. Furthermore, the supremum of the FDEn is achieved when q is near the boundary of the interval $(0, 1]$. Therefore, we can state the following theorem.
Theorem 1.
Let m be a non-degenerate BPA on a FOD X and $q \in (0, 1]$. Then, the supremum of the FDEn, as a function of q, is attained for $q \in \{0, 1\}$, and the infimum is attained at the extremes of the interval $(0, 1)$, or it is a minimum attained at a unique $q_0 \in (0, 1)$.
Proof. 
By noting that, for fixed $x > 0$, the function $g(p) = x^p$ is a convex function of p, we can conclude that the FDEn is a strictly convex function of q. Hence, we have three possible scenarios. In the first one, the FDEn is strictly increasing in q; hence, it attains its maximum value for $q = 1$, i.e., when it reduces to Deng entropy, and the infimum is 1 by the normalization condition. In the second scenario, the FDEn is strictly decreasing; hence, the supremum is 1 and the minimum is attained for $q = 1$. In the third case, there is a unique stationary point in $(0, 1)$; it is an absolute minimum, whereas the supremum is given by $\max\{1, E_d(m)\}$. □
In the following theorem, we study the maximum FDEn for a fixed value of q. This is an important issue in the theory of measures of uncertainty; see, for instance, [30] for the study of the maximum Deng entropy.
Theorem 2.
Let X be a FOD, $q \in (0, 1]$ and m be a BPA which assigns positive mass to each non-empty subset of X. The maximum FDEn is attained if the BPA m is defined as
$$m(A) = \frac{2^{|A|} - 1}{\sum_{B \subseteq X} (2^{|B|} - 1)}, \quad \forall A \subseteq X. \qquad (8)$$
Proof. 
For a fixed $q \in (0, 1]$, the FDEn is given by (7) as
$$E_d^q(m) = \sum_{A \subseteq X} m(A) \left[-\log_2 \frac{m(A)}{2^{|A|} - 1}\right]^q. \qquad (9)$$
We have to maximize (9) subject to the constraint
$$\sum_{A \subseteq X} m(A) = 1. \qquad (10)$$
We use the method of Lagrange multipliers, and we have to compute the partial derivatives of the function
$$\tilde{E}_d^q = \sum_{A \subseteq X} m(A) \left[-\log_2 \frac{m(A)}{2^{|A|} - 1}\right]^q + \lambda \left(\sum_{A \subseteq X} m(A) - 1\right)$$
with respect to $m(A)$. By differentiating $\tilde{E}_d^q$ with respect to $m(A)$, we have
$$\frac{\partial \tilde{E}_d^q}{\partial m(A)} = \left[-\log_2 \frac{m(A)}{2^{|A|} - 1}\right]^q - q \log_2(e) \left[-\log_2 \frac{m(A)}{2^{|A|} - 1}\right]^{q-1} + \lambda = \left[-\log_2 \frac{m(A)}{2^{|A|} - 1}\right]^{q-1} \left(-\log_2 \frac{m(A)}{2^{|A|} - 1} - q \log_2(e)\right) + \lambda.$$
In order for all the partial derivatives of $\tilde{E}_d^q$ to vanish, the ratio $\frac{m(A)}{2^{|A|} - 1} = K$ has to be invariant with respect to A. In fact, the function
$$g(z) = \left[-\log_2(z)\right]^{q-1} \left(-\log_2(z) - q \log_2(e)\right)$$
is strictly decreasing in $z \in (0, 1)$, since
$$g'(z) = \frac{q \log_2(e)}{z} \left[-\log_2(z)\right]^{q-2} \left((q - 1) \log_2(e) + \log_2(z)\right)$$
and $z < e^{1-q}$, so that $g'(z) < 0$. Hence, by the constraint (10), we get
$$K = \frac{1}{\sum_{B \subseteq X} (2^{|B|} - 1)}$$
and the BPA m which maximizes the FDEn is given in (8). □
Example 5.
Based on the result of Theorem 2, let us evaluate the maximum FDEn for a FOD of cardinality 3, $X = \{a, b, c\}$. In this case, the BPA given in (8) is defined as
$$m(a) = m(b) = m(c) = \frac{1}{19}, \quad m(a, b) = m(a, c) = m(b, c) = \frac{3}{19}, \quad m(X) = \frac{7}{19}.$$
Then, the maximum FDEn is given by
$$E_d^q(m) = [\log_2(19)]^q.$$
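The maximizing BPA of Theorem 2 is easy to build numerically. The sketch below constructs it for an arbitrary FOD and, reusing the fden function from the earlier sketch, reproduces the value $[\log_2(19)]^q$ of Example 5:

```python
import math
from itertools import combinations

def max_fden_bpa(X):
    # BPA of Theorem 2: m(A) proportional to 2^|A| - 1, over all non-empty A
    subsets = [frozenset(c) for r in range(1, len(X) + 1)
               for c in combinations(X, r)]
    Z = sum(2 ** len(A) - 1 for A in subsets)  # Z = 19 when |X| = 3
    return {A: (2 ** len(A) - 1) / Z for A in subsets}

m = max_fden_bpa('abc')
print(fden(m, 0.7), math.log2(19) ** 0.7)  # equal values
```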

4. Fractional Deng Extropy

In the following definition, we present the Deng extropy introduced by Buono and Longobardi [9] as a dual measure of uncertainty to Deng entropy.
Definition 7.
(Deng Extropy) Deng extropy was introduced in [9] for a BPA m on a FOD X as
$$EX_d(m) = -\sum_{A \subset X: m(A) > 0} (1 - m(A)) \log_2 \frac{1 - m(A)}{2^{|A^c|} - 1}, \qquad (11)$$
where the sum is taken over the focal elements different from X, $A^c$ is the complement of A in X and $|A^c| = |X| - |A|$.
Now, in analogy with FDEn, we introduce the fractional version of Deng extropy.
Definition 8.
Let m be a BPA on a FOD X. We define the Fractional Deng Extropy (FDEx) of m as
$$EX_d^q(m) = \sum_{A \subset X: m(A) > 0} (1 - m(A)) \left[-\log_2 \frac{1 - m(A)}{2^{|A^c|} - 1}\right]^q, \quad 0 < q \leq 1.$$
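As for the FDEn, Definition 8 can be computed in a few lines. The sketch below passes the FOD explicitly in order to evaluate $|A^c|$, restricts the sum to focal elements different from X, and checks the value $2[\log_2(9/2)]^q$ of Example 6(i) below:

```python
import math

def fdex(m, q, X):
    # Fractional Deng extropy; the sum runs over focal elements A different from X
    X = frozenset(X)
    total = 0.0
    for A, mass in m.items():
        if mass > 0 and A != X:
            comp = len(X) - len(A)  # |A^c|
            total += (1 - mass) * (-math.log2((1 - mass) / (2 ** comp - 1))) ** q
    return total

m = {frozenset({x}): 1/3 for x in 'abc'}
print(fdex(m, 0.5, 'abc'), 2 * math.log2(4.5) ** 0.5)  # equal values
```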
Example 6.
(i) 
Assume that the FOD is $X = \{a, b, c\}$. For a mass function $m(a) = m(b) = m(c) = \frac{1}{3}$, the associated FDEx is obtained as follows:
$$EX_d^q(m) = 2 \left[\log_2 \frac{9}{2}\right]^q.$$
Based on this BPA, we obtained the FDEn in Example 1. In Figure 4, the plot of the ratio $EX_d^q(m) / E_d^q(m)$ is given.
One can see that $EX_d^q(m) / E_d^q(m)$ is an increasing function of q and that it is greater than 1. Thus, for $q \in (0, 1]$, the FDEx is greater than the FDEn. Furthermore, $EX_d^q(m)$ is increasing in q and the maximum is achieved for $q = 1$, i.e., when the FDEx reduces to Deng extropy.
(ii) 
Assume there is $a \in X$ such that $m(a) = 1$. Then,
$$EX_d^q(m) = 0.$$
In this case, the FDEx is consistent with its dual measure, the FDEn.
Example 7.
Let us consider a FOD $X = \{a, b, c\}$. For a mass function $m(a) = m(b) = m(c) = m(a, b) = m(a, c) = m(b, c) = m(a, b, c) = \frac{1}{7}$, we obtain
$$EX_d^q(m) = \frac{18}{7} \left[\log_2 7 - 1\right]^q + \frac{18}{7} \left[\log_2 7 - \log_2 6\right]^q.$$
In Figure 5, the plot of $EX_d^q(m)$ is given. One can see that, as a function of q, it has a convex parabolic shape and the maximum is achieved when it reduces to Deng extropy.
Example 8.
Assume that the FOD is $X = \{a_1, a_2, \ldots, a_{20}\}$. For a mass function $m(\{a_1, a_2, \ldots, a_{10}\}) = 0.4$, $m(\{a_{11}, a_{12}, \ldots, a_{20}\}) = 0.6$, we obtain
$$EX_d^q(m) = 0.6 \left[-\log_2 \frac{0.6}{2^{10} - 1}\right]^q + 0.4 \left[-\log_2 \frac{0.4}{2^{10} - 1}\right]^q.$$
In this case, FDEx and FDEn are equal.
Example 9.
Given a FOD $X = \{a, b, c\}$ and a BPA m such that $m(a) = 0.9$, $m(a, b) = 0.01$ and $m(X) = 0.09$, we have
$$EX_d^q(m) = 0.1 \left[\log_2 30\right]^q + 0.99 \left[\log_2 \frac{100}{99}\right]^q.$$
In Figure 6, the plot of $EX_d^q(m)$ is given. One can see that, as a function of q, it has a convex parabolic shape and the maximum is achieved when q tends to zero.
Similarly to the FDEn, in the above examples it is seen that the function $EX_d^q(m)$ cannot be concave, and it can be increasing, decreasing or upside-down bathtub shaped. Furthermore, the supremum of the FDEx is achieved when q is near the boundary of the interval $(0, 1]$. The following theorem is immediate.
Theorem 3.
Let m be a non-degenerate BPA on a FOD X and $q \in (0, 1]$. Then, the supremum of the FDEx, as a function of q, is attained for $q \in \{0, 1\}$, and the infimum is attained at the extremes of the interval $(0, 1)$, or it is a minimum attained at a unique $q_0 \in (0, 1)$.
Proof. 
The proof is similar to that of Theorem 1; in this case, the supremum is given by $\max\{N - 1 + m(X), EX_d(m)\}$, where N is the number of focal elements different from X. □
Next, in analogy with Theorem 2, we obtain an upper bound for the maximum FDEx with a fixed value of q.
Theorem 4.
Let X be a FOD, $q \in (0, 1]$ and m be a BPA that assigns positive mass to each non-empty subset of X. For a fixed value of $m(X)$, an upper bound for the FDEx is attained in correspondence with the fictitious BPA $\tilde{m}$ such that $\tilde{m}(X) = m(X)$ and
$$\tilde{m}(A) = 1 - \frac{2^{|X|} - 3 + m(X)}{\sum_{B \subset X} (2^{|B^c|} - 1)} \left(2^{|A^c|} - 1\right), \quad \forall A \subset X. \qquad (12)$$
Proof. 
The proof is similar to the one given for Theorem 2. After establishing that $\frac{1 - m(A)}{2^{|A^c|} - 1} = K$ has to be invariant with respect to A, in order to satisfy the normalization condition, we get
$$1 - m(A) = K \left(2^{|A^c|} - 1\right)$$
and, by summing over $A \subset X$,
$$K = \frac{2^{|X|} - 3 + m(X)}{\sum_{A \subset X} (2^{|A^c|} - 1)}.$$
Hence, the BPA which maximizes the FDEx is given in (12). We have to specify that it is a fictitious BPA, in the sense that $\tilde{m}(A)$ may be negative for some subsets of X. □
Example 10.
Based on the result of Theorem 4, let us evaluate the upper bound for the FDEx in the case $|X| = 3$ with fixed $m(X)$. We have three proper subsets of cardinality one and three of cardinality two, and then the upper bound is given by
$$U = 3 \cdot \frac{3(5 + m(X))}{12} \left[-\log_2 \frac{5 + m(X)}{12}\right]^q + 3 \cdot \frac{5 + m(X)}{12} \left[-\log_2 \frac{5 + m(X)}{12}\right]^q = (5 + m(X)) \left[\log_2 \frac{12}{5 + m(X)}\right]^q.$$

5. Application to a Problem of Classification

In this section, we apply the FDEn and FDEx to a problem of classification. We analyze a dataset given in [31] about typical qualities of Italian wines. The dataset is composed of 178 instances and, for each one, thirteen attributes are given. The instances are divided into three classes of wine: class 1, class 2 and class 3. We use six attributes to determine the correct class of each instance. In particular, the attributes involved in this example are: Alcohol, Malic acid, Ash, OD280/OD315 of diluted wines (OD), Color intensity (CI) and Proline. We use the method of max–min values to generate a model of interval numbers. In particular, for a fixed attribute, we take the interval of variability within a single class, and then we intersect the intervals of two or more classes. The model of interval numbers is shown in Table 3.
Suppose the selected instance is $(13.860, 1.5100, 2.6700, 3.1600, 3.3800, 410)$. From the dataset, we know that the selected instance belongs to class 2, and our purpose is to classify it correctly. We generate six BPAs, one for each attribute, by using a method based on the similarity of interval numbers proposed by Kang et al. [32]. Given two intervals $A = [a_1, a_2]$ and $B = [b_1, b_2]$, their similarity $S(A, B)$ can be defined as
$$S(A, B) = \frac{1}{1 + \alpha D(A, B)},$$
where $\alpha > 0$ is the coefficient of support (here we use $\alpha = 5$) and $D(A, B)$ is the distance between the intervals A and B defined in [33] as
$$D^2(A, B) = \left(\frac{a_1 + a_2}{2} - \frac{b_1 + b_2}{2}\right)^2 + \frac{1}{3} \left[\left(\frac{a_2 - a_1}{2}\right)^2 + \left(\frac{b_2 - b_1}{2}\right)^2\right].$$
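A small sketch of these two formulas, applied to the Alcohol value 13.860 of the selected instance (viewed as the degenerate interval $[13.860, 13.860]$) and the class-1 interval of Table 3:

```python
import math

def interval_distance(A, B):
    # D(A, B) of Tran and Duckstein [33] for intervals A = [a1, a2], B = [b1, b2]
    (a1, a2), (b1, b2) = A, B
    d2 = ((a1 + a2) / 2 - (b1 + b2) / 2) ** 2 + \
         (((a2 - a1) / 2) ** 2 + ((b2 - b1) / 2) ** 2) / 3
    return math.sqrt(d2)

def similarity(A, B, alpha=5):
    # S(A, B) = 1 / (1 + alpha * D(A, B)), with support coefficient alpha
    return 1 / (1 + alpha * interval_distance(A, B))

# Similarity before normalization over the seven focal elements (cf. Table 4)
print(similarity((12.850, 14.830), (13.860, 13.860)))
```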
For each attribute, we can get seven values of similarity by choosing as A the intervals given in Table 3 and as B the corresponding singleton of the selected instance. Then, by normalizing the obtained values, we get six BPAs, as reported in Table 4.
Without any additional information, we can evaluate a final BPA giving the same weight to each attribute, i.e., by summing the six values related to a focal element and then dividing by six. In this way, we get the final BPA shown in Table 5.
Now, based on the BPA in Table 5, we can evaluate the PPT (5) of the classes, and we get
$$PPT(1) = 0.3500, \quad PPT(2) = 0.3464, \quad PPT(3) = 0.3036.$$
Hence, the focal element with the highest PPT is class 1, so it would be our final hypothesis, and we would not make the correct decision.
We try to improve the described method by using the FDEn. Let us fix the value $q = 0.6$. We evaluate the FDEn of the BPAs given in Table 4, and we obtain the results shown in Table 6.
Since a higher value of FDEn means higher uncertainty, we can give more weight to the attributes with lower FDEn. In particular, we define the weights by normalizing to 1 the reciprocals of the fractional Deng entropies. We obtain the weights presented in Table 7.
Based on the weights in Table 7, we get a weighted version of the final BPA, as shown in Table 8.
Finally, based on the BPA in Table 8, we evaluate the PPT of the classes and we get
$$PPT(1) = 0.3426, \quad PPT(2) = 0.3499, \quad PPT(3) = 0.3075.$$
Hence, the focal element with the highest PPT is class 2, so it is our final hypothesis and we made the correct decision.
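The whole weighted scheme fits in a few lines; the sketch below reuses the fden and ppt functions from the earlier sketches, and attribute_bpas stands for the six BPAs of Table 4 (a hypothetical variable, shown only to indicate the intended usage):

```python
def fden_weights(bpas, q=0.6):
    # Weights proportional to the reciprocals of the fractional Deng entropies
    inv = [1 / fden(m, q) for m in bpas]
    s = sum(inv)
    return [w / s for w in inv]

def weighted_final_bpa(bpas, weights):
    # Weighted average of the attribute BPAs, focal element by focal element
    final = {}
    for m, w in zip(bpas, weights):
        for A, mass in m.items():
            final[A] = final.get(A, 0.0) + w * mass
    return final

# attribute_bpas: list of six dicts (frozenset -> mass), one per attribute
# final = weighted_final_bpa(attribute_bpas, fden_weights(attribute_bpas, q=0.6))
# chosen_class = max('123', key=lambda c: ppt(final, frozenset({c})))
```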
Along the same lines, we can use the FDEx. In Table 9, we give the recognition rates of the non-weighted method and of the methods based on FDEn and FDEx for different choices of q.

6. Conclusions

In this paper, fractional Deng entropy and extropy have been defined, starting from the definitions of Deng entropy and extropy. These measures have been compared with other well-known ones, and some examples have been proposed. Characterization results for the maximum fractional Deng entropy and extropy have been given, and finally, a problem of classification based on a dataset has been discussed in order to emphasize the relevance of these measures in pattern recognition.

Author Contributions

The authors contributed equally to this paper working together to conceptualize and apply their new definitions. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

Publicly available datasets were analyzed in this study. This data can be found here: http://archive.ics.uci.edu/ml (accessed on: 18 April 2021).

Acknowledgments

Francesco Buono and Maria Longobardi are members of the research group GNAMPA of INdAM (Istituto Nazionale di Alta Matematica) and are partially supported by MIUR-PRIN 2017, project “Stochastic Models for Complex Systems”, no. 2017 JFFHSH.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
BPA: Basic probability assignment
CI: Color intensity
DST: Dempster–Shafer theory of evidence
FDEn: Fractional Deng entropy
FDEx: Fractional Deng extropy
FOD: Frame of discernment
OD: OD280/OD315 of diluted wines
PPT: Pignistic probability transformation

References

1. Shannon, C. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–432.
2. Lad, F.; Sanfilippo, G.; Agrò, G. Extropy: Complementary dual of entropy. Stat. Sci. 2015, 30, 40–58.
3. Balakrishnan, N.; Buono, F.; Longobardi, M. On weighted extropies. Commun. Stat. Theory Methods 2020.
4. Jahanshahi, S.M.A.; Zarei, H.; Khammar, A.H. On cumulative residual extropy. Probab. Eng. Inf. Sci. 2019.
5. Kamari, O.; Buono, F. On extropy of past lifetime distribution. Ric. Mat. 2020.
6. Qiu, G.; Jia, K. The residual extropy of order statistics. Stat. Probab. Lett. 2018, 133, 15–22.
7. Ubriaco, M.R. Entropies based on fractional calculus. Phys. Lett. A 2009, 373, 2516–2519.
8. Deng, Y. Deng entropy. Chaos Solitons Fractals 2016, 91, 549–553.
9. Buono, F.; Longobardi, M. A dual measure of uncertainty: The Deng extropy. Entropy 2020, 22, 582.
10. Dempster, A.P. Upper and lower probabilities induced by a multivalued mapping. Ann. Math. Stat. 1967, 38, 325–339.
11. Shafer, G. A Mathematical Theory of Evidence; Princeton University Press: Princeton, NJ, USA, 1976.
12. Han, Y.; Deng, Y. An enhanced fuzzy evidential DEMATEL method with its application to identify critical success factors. Soft Comput. 2018, 22, 5073–5090.
13. Liu, Z.; Pan, Q.; Dezert, J.; Han, J.W.; He, Y. Classifier fusion with contextual reliability evaluation. IEEE Trans. Cybern. 2018, 48, 1605–1618.
14. Fu, C.; Yang, J.B.; Yang, S.L. A group evidential reasoning approach based on expert reliability. Eur. J. Oper. Res. 2015, 246, 886–893.
15. Yang, J.B.; Xu, D.L. Evidential reasoning rule for evidence combination. Artif. Intell. 2013, 205, 1–29.
16. Kabir, G.; Tesfamariam, S.; Francisque, A.; Sadiq, R. Evaluating risk of water mains failure using a Bayesian belief network model. Eur. J. Oper. Res. 2015, 240, 220–234.
17. Liu, H.C.; You, J.X.; Fan, X.J.; Lin, Q.L. Failure mode and effects analysis using D numbers and grey relational projection method. Expert Syst. Appl. 2014, 41, 4670–4679.
18. Yin, L.; Deng, X.; Deng, Y. The negation of a basic probability assignment. IEEE Trans. Fuzzy Syst. 2019, 27, 135–143.
19. Jiang, W. A correlation coefficient for belief functions. Int. J. Approx. Reason. 2018, 103, 94–106.
20. Smets, P. Data fusion in the transferable belief model. In Proceedings of the Third International Conference on Information Fusion, Paris, France, 10–13 July 2000; Volume 1, pp. PS21–PS33.
21. Hohle, U. Entropy with respect to plausibility measures. In Proceedings of the 12th IEEE International Symposium on Multiple-Valued Logic, Paris, France, 10–12 May 1982; pp. 167–169.
22. Yager, R.R. Entropy and specificity in a mathematical theory of evidence. Int. J. Gen. Syst. 1983, 9, 249–260.
23. Dubois, D.; Prade, H. A note on measures of specificity for fuzzy sets. Int. J. Gen. Syst. 1985, 10, 279–283.
24. Klir, G.J.; Ramer, A. Uncertainty in Dempster-Shafer theory: A critical re-examination. Int. J. Gen. Syst. 1991, 18, 155–166.
25. Klir, G.J.; Parviz, B. A note on the measure of discord. In Proceedings of the Eighth International Conference on Uncertainty in Artificial Intelligence, Stanford, CA, USA, 17–19 July 1992; pp. 138–141.
26. George, T.; Pal, N.R. Quantification of conflict in Dempster-Shafer framework: A new approach. Int. J. Gen. Syst. 1996, 24, 407–423.
27. Zhou, D.; Tang, Y.; Jiang, W. A modified belief entropy in Dempster-Shafer framework. PLoS ONE 2017, 12, e0176832.
28. Pan, L.; Deng, Y. A new belief entropy to measure uncertainty of basic probability assignments based on belief function and plausibility function. Entropy 2018, 20, 842.
29. Cui, H.; Liu, Q.; Zhang, J.; Kang, B. An improved Deng entropy and its application in pattern recognition. IEEE Access 2019, 7, 18283–18292.
30. Kang, B.; Deng, Y. The maximum Deng entropy. IEEE Access 2019, 7, 120758–120765.
31. Dua, D.; Graff, C. UCI Machine Learning Repository. 2019. Available online: http://archive.ics.uci.edu/ml (accessed on 18 April 2021).
32. Kang, B.Y.; Li, Y.; Deng, Y.; Zhang, Y.J.; Deng, X.Y. Determination of basic probability assignment based on interval numbers and its application. Acta Electron. Sin. 2012, 40, 1092–1096.
33. Tran, L.; Duckstein, L. Comparison of fuzzy numbers using a fuzzy distance measure. Fuzzy Sets Syst. 2002, 130, 331–341.
Figure 1. Plot of $E_d^q(m_2)$ in Example 2 as a function of q.
Figure 2. Plot of $E_d^q(m)$ in Example 3 as a function of q.
Figure 3. Plot of $E_d^q(m)$ in Example 4 as a function of q for different values of $p_a$.
Figure 4. Plot of $EX_d^q(m) / E_d^q(m)$ in Example 6 as a function of q.
Figure 5. Plot of $EX_d^q(m)$ in Example 7 as a function of q.
Figure 6. Plot of $EX_d^q(m)$ in Example 9 as a function of q.
Table 1. Uncertainty measures in the DST framework.

Uncertainty Measure | Definition
Hohle’s confusion measure [21] | $C_H(m) = -\sum_{A \subseteq X} m(A) \log_2 Bel(A)$
Yager’s dissonance measure [22] | $E_Y(m) = -\sum_{A \subseteq X} m(A) \log_2 Pl(A)$
Dubois and Prade’s weighted Hartley entropy [23] | $E_{DP}(m) = \sum_{A \subseteq X} m(A) \log_2 |A|$
Klir and Ramer’s discord measure [24] | $D_{KR}(m) = -\sum_{A \subseteq X} m(A) \log_2 \sum_{B \subseteq X} m(B) \frac{|A \cap B|}{|B|}$
Klir and Parviz’s strife measure [25] | $S_{KP}(m) = -\sum_{A \subseteq X} m(A) \log_2 \sum_{B \subseteq X} m(B) \frac{|A \cap B|}{|A|}$
George and Pal’s total conflict measure [26] | $TC_{GP}(m) = \sum_{A \subseteq X} m(A) \sum_{B \subseteq X} m(B) \left(1 - \frac{|A \cap B|}{|A \cup B|}\right)$
Table 2. Modified Deng entropy in the DST framework.

Uncertainty Measure | Definition
Zhou et al.’s entropy [27] | $E_{Md}(m) = -\sum_{A \subseteq X} m(A) \log_2 \left(\frac{m(A)}{2^{|A|} - 1} e^{\frac{|A| - 1}{|X|}}\right)$
Pan et al.’s entropy [28] | $P_{Bel}(m) = -\sum_{A \subseteq X} \frac{Bel(A) + Pl(A)}{2} \log_2 \frac{(Bel(A) + Pl(A))/2}{2^{|A|} - 1}$
Cui et al.’s entropy [29] | $E(m) = -\sum_{A \subseteq X} m(A) \log_2 \left(\frac{m(A)}{2^{|A|} - 1} e^{\sum_{B \subseteq X, B \neq A} \frac{|A \cap B|}{2^{|X|} - 1}}\right)$
Table 3. The model of interval numbers (∅ denotes an empty intersection of class intervals).

Class | Alcohol | Malic Acid | Ash | OD | CI | Proline
{1} | [12.850, 14.830] | [1.3500, 4.0400] | [2.0400, 3.2200] | [2.5100, 4.0000] | [3.5200, 8.9000] | [680, 1680]
{2} | [11.030, 13.860] | [0.7400, 5.8000] | [1.3600, 3.2300] | [1.5900, 3.6900] | [1.2800, 6.0000] | [278, 985]
{3} | [12.200, 14.340] | [1.2400, 5.6500] | [2.1000, 2.8600] | [1.2700, 2.4700] | [3.8500, 13.0000] | [415, 880]
{1, 2} | [12.850, 13.860] | [1.3500, 4.0400] | [2.0400, 3.2200] | [2.5100, 3.6900] | [3.5200, 6.0000] | [680, 985]
{1, 3} | [12.850, 14.340] | [1.3500, 4.0400] | [2.1000, 2.8600] | ∅ | [3.8500, 8.9000] | [680, 880]
{2, 3} | [12.200, 13.860] | [1.2400, 5.6500] | [2.1000, 2.8600] | [1.5900, 2.4700] | [3.8500, 6.0000] | [415, 880]
{1, 2, 3} | [12.850, 13.860] | [1.3500, 4.0400] | [2.1000, 2.8600] | ∅ | [3.8500, 6.0000] | [680, 880]
Table 4. BPAs based on Kang’s method.

Class | Alcohol | Malic Acid | Ash | OD | CI | Proline
m(1) | 0.1699 | 0.1685 | 0.1416 | 0.2700 | 0.0967 | 0.0623
m(2) | 0.0715 | 0.1095 | 0.0897 | 0.1732 | 0.2088 | 0.1700
m(3) | 0.1244 | 0.1083 | 0.1568 | 0.1126 | 0.0562 | 0.1877
m(1, 2) | 0.1675 | 0.1685 | 0.1416 | 0.3168 | 0.1889 | 0.1187
m(1, 3) | 0.1860 | 0.1685 | 0.1568 | 0.0000 | 0.0939 | 0.1368
m(2, 3) | 0.1132 | 0.1083 | 0.1568 | 0.1273 | 0.1777 | 0.1877
m(1, 2, 3) | 0.1675 | 0.1685 | 0.1568 | 0.0000 | 0.1777 | 0.1368
Table 5. Final BPA.

Class | Final BPA
m(1) | 0.1515
m(2) | 0.1371
m(3) | 0.1243
m(1, 2) | 0.1837
m(1, 3) | 0.1237
m(2, 3) | 0.1452
m(1, 2, 3) | 0.1345
Table 6. Fractional Deng entropies of the BPAs in Table 4.

Attribute | Alcohol | Malic Acid | Ash | OD | CI | Proline
FDEn | 2.2684 | 2.2658 | 2.2638 | 1.8801 | 2.2494 | 1.4378
Table 7. The weights of the attributes based on FDEn.

Attribute | Alcohol | Malic Acid | Ash | OD | CI | Proline
Weight | 0.1472 | 0.1473 | 0.1474 | 0.1775 | 0.1484 | 0.2322
Table 8. Final weighted BPA.

Class | Final Weighted BPA
m(1) | 0.1474
m(2) | 0.1411
m(3) | 0.1293
m(1, 2) | 0.1822
m(1, 3) | 0.1210
m(2, 3) | 0.1483
m(1, 2, 3) | 0.1307
Table 9. The recognition rate.

Non-Weighted Method | q | FDEn Method | FDEx Method
93.26% | 0.5 | 94.38% | 93.26%
93.26% | 0.6 | 94.94% | 93.26%
93.26% | 1 | 94.38% | 93.26%