1. Introduction
The notions of entropy and mutual information are basic notions in information theory [1] and, as is known, the customary approach is based on Shannon’s entropy [2]. Let $P = (p_1, p_2, \ldots, p_n)$ be a probability distribution; Shannon’s entropy of $P$ is the number $H(P) = \sum_{i=1}^{n} s(p_i)$, where $s$ is the Shannon function defined by $s(x) = -x \log x$ for every $x \in [0, 1]$. Note that we use the convention (based on continuity arguments) that $0 \cdot \log 0 = 0$.
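A minimal computational illustration (not part of the original text) may be useful here: the sketch below computes Shannon’s entropy of a finite probability distribution, with the convention $0 \cdot \log 0 = 0$ handled explicitly.

```python
import math

def shannon_entropy(p, base=2):
    """Shannon entropy H(P) = sum_i s(p_i), where s(x) = -x log x and s(0) = 0."""
    assert all(x >= 0 for x in p) and abs(sum(p) - 1.0) < 1e-9, "P must be a probability distribution"
    # Skipping zero probabilities implements the continuity convention 0 * log 0 = 0.
    return sum(-x * math.log(x, base) for x in p if x > 0)

# A fair coin has entropy 1 bit; a degenerate distribution has entropy 0.
print(shannon_entropy([0.5, 0.5]))   # 1.0
print(shannon_entropy([1.0, 0.0]))   # 0.0
```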
The idea of Shannon’s entropy was generalized in a natural way to the Kolmogorov–Sinai entropy of dynamical systems [3,4,5], which allows dynamical systems to be distinguished. Kolmogorov and Sinai applied the entropy to prove that non-isomorphic Bernoulli shifts exist. Of course, the theory of Kolmogorov–Sinai entropy has many other important applications. For this reason, various proposals were made to generalize the Kolmogorov–Sinai entropy concept. In [6], we generalized the Kolmogorov–Sinai entropy concept to the case of a fuzzy probability space [7]. This structure represents an alternative mathematical model of probability theory for situations in which the considered events are fuzzy events, i.e., events described unclearly or vaguely. Further proposals for fuzzy generalizations of Shannon’s and the Kolmogorov–Sinai entropy are presented, e.g., in [8,9,10,11,12,13,14,15,16,17]. It is known that there exist many ways to define operations for modeling the union and intersection of fuzzy sets; an overview is given in [18]. We remark that while the model studied in [6] was based on Zadeh’s fuzzy set operations [19], in our study [14], the Lukasiewicz fuzzy set operations were used.
Since its inception in 1965, fuzzy set theory has been continually developing, and it has been shown to be useful in many disciplines. It has been applied to many mathematical areas, such as algebra, analysis, clustering, graph theory, measure theory, probability theory, control theory, optimization, topology, and so on. Currently, algebraic structures based on fuzzy set theory, such as MV-algebras [20,21,22,23,24,25,26,27,28], D-posets [29,30,31], effect algebras [32,33], and A-posets [34,35,36], are intensively studied. There are also interesting results about Kolmogorov-type entropy on these structures; some of them can be found, e.g., in [37,38,39,40,41,42,43]. Moreover, fuzzy set theory also has significant practical applications; applications of this theory can be found, for example, in control engineering, data processing, management, logistics, artificial intelligence, computer science, medicine, decision theory, expert systems, logic, management science, operations research, pattern recognition, and robotics.
In 1983, Atanassov introduced a more general fuzzy theory—intuitionistic fuzzy sets theory [44,45,46]. Recall that while a fuzzy set is a mapping $f_A: \Omega \to [0, 1]$ (where the considered fuzzy set is identified with its membership function $f_A$), the intuitionistic fuzzy set (shortly, IF-set) is a pair $A = (\mu_A, \nu_A)$ of fuzzy sets $\mu_A, \nu_A: \Omega \to [0, 1]$ for which the condition $\mu_A(\omega) + \nu_A(\omega) \leq 1$ for every $\omega \in \Omega$ is satisfied. The function $\mu_A$ is called the membership function of $A$, the function $\nu_A$ is called the non-membership function of $A$. Evidently, each fuzzy set $f_A$ can be regarded as an IF-set $A = (f_A, 1 - f_A)$. Each result that applies to IF-sets also applies to the case of fuzzy sets. Of course, the opposite implication is not valid; e.g., the representation theorem of IF-states does not follow from the corresponding result for fuzzy states. The theory of IF-sets represents a non-trivial generalization of the fuzzy set theory; thus, IF-sets provide opportunities to model a larger class of real situations. We remark that a probability theory on intuitionistic fuzzy events has been elaborated in [47]; see also [48]. Some results about Kolmogorov-type entropy for the case of intuitionistic fuzzy sets are given, e.g., in [49,50,51,52,53].
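To make the definition concrete, the following small sketch (our own illustration; the names and the finite-universe representation are not taken from the cited works) stores an IF-set as a pair of membership and non-membership functions with the constraint $\mu_A + \nu_A \leq 1$ checked pointwise, and embeds an ordinary fuzzy set $f$ as the IF-set $(f, 1 - f)$.

```python
class IFSet:
    """An intuitionistic fuzzy set on a finite universe: a pair (mu, nu) with mu + nu <= 1 pointwise."""

    def __init__(self, mu, nu):
        # mu, nu: dicts mapping each element of the universe to a degree in [0, 1]
        if mu.keys() != nu.keys():
            raise ValueError("mu and nu must be defined on the same universe")
        for x in mu:
            if not (0.0 <= mu[x] <= 1.0 and 0.0 <= nu[x] <= 1.0 and mu[x] + nu[x] <= 1.0 + 1e-12):
                raise ValueError(f"constraint 0 <= mu, nu and mu + nu <= 1 violated at {x!r}")
        self.mu, self.nu = mu, nu

    @classmethod
    def from_fuzzy(cls, f):
        """Embed an ordinary fuzzy set f (its membership function) as the IF-set (f, 1 - f)."""
        return cls(dict(f), {x: 1.0 - v for x, v in f.items()})

# Example: a fuzzy set on the universe {a, b} regarded as an IF-set.
A = IFSet.from_fuzzy({"a": 0.7, "b": 0.2})
print(A.mu, A.nu)   # the membership and non-membership functions of A
```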
When solving some specific problems, instead of Shannon’s entropy it is more appropriate to use an approach based on the concept of logical entropy [54,55,56,57]. If $P = (p_1, p_2, \ldots, p_n)$ is a probability distribution, then the logical entropy of $P$ is defined by the formula $h(P) = \sum_{i=1}^{n} p_i (1 - p_i)$. In [57], historical aspects of the logical entropy formula are discussed and the relationship between logical entropy and Shannon’s entropy is examined. The concepts of logical conditional entropy and logical mutual information have been introduced as well. We note that some results about the logical entropy on some of the above-mentioned algebraic structures based on fuzzy set theory can be found, e.g., in [58,59,60,61,62].
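As with Shannon’s entropy above, a short computational sketch (ours, not taken from [54,55,56,57]) may help fix ideas: the logical entropy $h(P) = \sum_{i} p_i(1 - p_i) = 1 - \sum_{i} p_i^2$ is the probability that two independent draws from $P$ yield different outcomes.

```python
def logical_entropy(p):
    """Logical entropy h(P) = sum_i p_i (1 - p_i) = 1 - sum_i p_i**2."""
    assert all(x >= 0 for x in p) and abs(sum(p) - 1.0) < 1e-9
    return 1.0 - sum(x * x for x in p)

# The uniform distribution on n outcomes has logical entropy 1 - 1/n;
# a degenerate distribution has logical entropy 0.
print(logical_entropy([0.25] * 4))   # 0.75
print(logical_entropy([1.0, 0.0]))   # 0.0
```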
The purpose of the present work is to study the logical entropy and logical mutual information of experiments in the intuitionistic fuzzy case. The paper is organized in the following way. In the following section, basic definitions and notations are provided. In Section 3, the concept of logical entropy for the case of intuitionistic fuzzy experiments is introduced, and basic properties of the proposed measure are shown. In Section 4, we introduce the concepts of logical mutual information and conditional mutual information of intuitionistic fuzzy experiments and derive some properties of these measures. In Section 5, using the suggested concept of logical entropy, we define the logical entropy of IF-dynamical systems. It is shown that the logical entropy of IF-dynamical systems is invariant under isomorphism. Finally, an analogy of the Kolmogorov–Sinai theorem on generators for IF-dynamical systems is proved. Section 6 contains a brief summary.
2. Basic Definitions, Notations and Facts
In this section, we provide basic definitions, notations and facts that will be used throughout the contribution.
Definition 1. By an IF-event we will understand a pair $A = (\mu_A, \nu_A)$ of functions $\mu_A, \nu_A: \Omega \to [0, 1]$ with the property $\mu_A(\omega) + \nu_A(\omega) \leq 1$ for every $\omega \in \Omega$.
In the following, we will use the symbol
to denote the family of all IF-events. As in the fuzzy case, there are many possibilities to define operations for modeling the union and intersection of IF-sets (see, e.g., [63,64,65]). We will use the operations
and
defined as follows. In the family
we define the partial binary operation
in the following way: if
and
are two IF-events, then
Here,
denotes the function defined by
for every
Similarly, we denote by
the function defined by
for every
Evidently, if
then
exists if and only if
and
The zero element of operation
is the IF-event
Indeed,
for any
Further, in the family
we define the binary operation
in the following way: if
and
then
Put
Evidently,
for any
The IF-event
is interpreted as an impossible event; the IF-event
as a certain event. It can easily be verified that, for any
the following conditions are satisfied:
- (F1)
$A \oplus B = B \oplus A$, if one side is defined (commutativity);
- (F2)
$(A \oplus B) \oplus C = A \oplus (B \oplus C)$, if one side is defined (associativity);
- (F3)
if exists, then exists, and
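The displayed formulas defining the two operations did not survive the extraction of this text, so the sketch below should be read as an assumption rather than as the paper’s definition: it implements the Lukasiewicz-type partial sum $A \oplus B = (\mu_A + \mu_B, \nu_A + \nu_B - 1)$, which is consistent with the existence condition stated above ($\mu_A + \mu_B \leq 1$ and $\nu_A + \nu_B \geq 1$) and with $(0_\Omega, 1_\Omega)$ being its zero element, together with a product-type operation $A \odot B = (\mu_A \cdot \mu_B, \nu_A + \nu_B - \nu_A \cdot \nu_B)$ having $(1_\Omega, 0_\Omega)$ as its unit.

```python
def oplus(A, B):
    """Assumed Lukasiewicz-type partial sum of IF-events A = (mu_A, nu_A) and B = (mu_B, nu_B).
    Returns None when A (+) B does not exist."""
    (muA, nuA), (muB, nuB) = A, B
    if any(muA[x] + muB[x] > 1.0 or nuA[x] + nuB[x] < 1.0 for x in muA):
        return None
    return ({x: muA[x] + muB[x] for x in muA}, {x: nuA[x] + nuB[x] - 1.0 for x in nuA})

def odot(A, B):
    """Assumed product-type operation on IF-events, with unit (1, 0)."""
    (muA, nuA), (muB, nuB) = A, B
    return ({x: muA[x] * muB[x] for x in muA},
            {x: nuA[x] + nuB[x] - nuA[x] * nuB[x] for x in nuA})

# On a one-point universe {w}: the zero element (0, 1) and the unit (1, 0).
zero = ({"w": 0.0}, {"w": 1.0})
unit = ({"w": 1.0}, {"w": 0.0})
A = ({"w": 0.4}, {"w": 0.5})
print(oplus(A, zero))   # equals A: (0, 1) acts as the zero element of the partial sum
print(odot(A, unit))    # equals A: (1, 0) acts as the unit of the product operation
```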
Since in the fuzzy case the inequality $f_A \leq f_B$ implies $1 - f_A \geq 1 - f_B$, in the family of all IF-events it is natural to define the relation $\leq$ as follows: if $A = (\mu_A, \nu_A)$ and $B = (\mu_B, \nu_B)$ are two IF-events, then $A \leq B$ if and only if $\mu_A \leq \mu_B$ and $\nu_A \geq \nu_B$. The relation $\leq$ is a partial order such that $(0_\Omega, 1_\Omega) \leq A \leq (1_\Omega, 0_\Omega)$ for all IF-events $A$.
Gutierrez Garcia and Rodabaugh have proved that intuitionistic fuzzy sets ordering and topology are reduced to the ordering and topology of fuzzy sets [
66]. Another situation is in measure theory, where the intuitionistic fuzzy case cannot be reduced to the fuzzy case.
Definition 2. A map $m$ from the family of all IF-events to the interval $[0, 1]$ is said to be a state if the following conditions are satisfied:
- (i)
$m(A \oplus B) = m(A) + m(B)$, whenever $A \oplus B$ is defined;
- (ii)
$m((1_\Omega, 0_\Omega)) = 1$.
Example 1. Consider a probability space and put It is easy to verify that the mapping defined, for any element of by the formula: is a state. Namely, for every such that exists, we have: and
Remark 1. Riečan and Ciungu have shown in [67] that any continuous state m defined on a family of all S-measurable IF-events has the form (1). In more detail, if a state m defined on a family of all S-measurable IF-events is continuous (i.e., A implies ), then there exist exactly one probability measure and exactly one such that:
Definition 3. By an IF-partition of we will understand a finite collection of elements of such that exists, and
Remark 2. A classical probability space can be regarded as a family of IF-events, if we put where is the characteristic function of a set the mapping defined by is a state on A usual measurable partition of a space (i.e., any sequence such that and Ø) can be regarded as an IF-partition, if we consider instead of Namely, Ø implies for every and hence exists. Moreover, we have: and the equality implies:
Definition 4. Let , be two IF-partitions of The IF-partition is said to be a refinement of (with respect to m) if for each there exists a subset such that Ø,
for and
In the case that is a refinement of we write
Denote by the family of all mappings If and are two elements of then we put and
Theorem 1. Let be a state. Then, the mapping defined, for any element of by is a state, and i.e., for any
Proof. The proof can be found in [68]. ☐
Proposition 1. Let such that Then, for any
Proof. Put
Then:
hence,
From the monotonicity of
it follows
Proposition 2. Let be an IF-partition of Then for any
Proof. Since
by Proposition 1 and (F3) we get:
Definition 5. Let , , be two IF-partitions of Their join is defined as the system , if and
Theorem 2. If are two IF-partitions of then is also an IF-partition of and .
Proof. Let , Since and exist, according to (F3) we obtain that also exists, and
Moreover, using Proposition 1 we get:
This means that is an IF-partition of
Since the system
is indexed by
we put
Since
according to Proposition 1 and (F3), for
we get:
However, this means that ☐
3. Logical Entropy of IF-Partitions
It is obvious that each IF-partition $\xi = \{A_1, A_2, \ldots, A_n\}$ represents, from the point of view of classical probability theory, a random experiment with a finite number of results $A_1, A_2, \ldots, A_n$ that are intuitionistic fuzzy events, with a probability distribution $m(A_1), m(A_2), \ldots, m(A_n)$. Namely, $m(A_i) \geq 0$ for $i = 1, 2, \ldots, n$, and $\sum_{i=1}^{n} m(A_i) = 1$. For that reason, we define the logical entropy of $\xi$ as the number:
$$h(\xi) = \sum_{i=1}^{n} m(A_i)\,(1 - m(A_i)).$$
Since $\sum_{i=1}^{n} m(A_i) = 1$, we can also write:
$$h(\xi) = 1 - \sum_{i=1}^{n} m(A_i)^2.$$
Remark 3. Evidently, the IF-partition has zero logical entropy.
Example 2. Consider the measurable space where is the unit interval [0,1] and is the σ-algebra of all Borel subsets of [0,1]. Now, we can consider the family of all S-measurable IF-events and the state defined, for any element of by the formula: Put Since (and therefore exists), and the set is an IF-partition. It has the m-state values of the corresponding elements and the logical entropy
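A small numerical sketch of this definition (assuming, as reconstructed above, that $h(\xi)$ is computed from the $m$-state values $m(A_1), \ldots, m(A_n)$ of the elements of the partition):

```python
def logical_entropy_of_partition(state_values):
    """Logical entropy of an IF-partition from the m-state values m(A_1), ..., m(A_n),
    which sum to 1: h(xi) = sum_i m(A_i)(1 - m(A_i)) = 1 - sum_i m(A_i)**2."""
    assert abs(sum(state_values) - 1.0) < 1e-9
    return 1.0 - sum(v * v for v in state_values)

# An IF-partition whose elements have m-state values 1/2, 1/3, 1/6.
print(logical_entropy_of_partition([1/2, 1/3, 1/6]))   # approximately 0.6111
```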
Some basic properties of the logical entropy of IF-partitions are listed below.
Theorem 3. Let be two IF-partitions of Then:
- (i)
- (ii)
implies
- (iii)
.
Proof. The property (i) is evident. We will prove the second property. Let
,
,
. Then, for any
there exists a subset
, such that
, for
and
. Hence, we can write:
As a consequence of the inequality
which is true for all non-negative real numbers
we get:
The inequality (iii) is a simple consequence of the previous property and Theorem 2. ☐
Definition 6. If , are two IF-partitions of then the conditional logical entropy of assuming a realization of the IF-experiment is defined by the formula:
Remark 4. Since for the conditional logical entropy it holds that If we put then
Remark 5. Since by Proposition 2, it holds that for we can also write:
Theorem 4. Let be two IF-partitions of Then
Proof. Assume that
,
Let us calculate:
Remark 6. As a simple consequence of Theorem 4, we get: and according to Definition 5 we obtain that
Theorem 5. Let be two IF-partitions of Then
- (i)
;
- (ii)
Proof. (i) Assume that
,
Since by Proposition 2, we have:
for
it holds:
Therefore, we get:
(ii) The property (i) along with (7) implies:
The proof is complete. ☐
Theorem 6. Let be IF-partitions of Then
Proof. Let
. Then by Equation (5) we get:
Theorem 7. Let and be IF-partitions of Then
- (i)
- (ii)
Proof. (i) We shall prove the statement using mathematical induction. By Equation (7), we have:
For
using the previous equality and Theorem 6, we get:
Now let us suppose that the result is true for a given
Then
Thus, by the principle of mathematical induction, the result follows.
(ii) The proof of the second assertion is analogous; it suffices to use Theorem 6 and the principle of mathematical induction. ☐
4. Logical Mutual Information of IF-Partitions
In this section, using the results of the previous parts, we define the notions of logical mutual information and logical conditional mutual information of IF-partitions and prove basic properties of these measures. We also present some numerical examples to illustrate the results.
Definition 7. Let be two IF-partitions of Then, we define the logical mutual information of and by the formula:
Remark 7. As a simple consequence of Equation (6), we have: From Equation (9), it follows that and
Theorem 8. Let be two IF-partitions of Then Proof. The non-negativity of logical mutual information follows from the subadditivity of logical entropy (the property (ii) of Theorem 5) and Equation (9). The second inequality is a consequence of Equation (9) and the property (iii) of Theorem 3. ☐
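The following computational sketch is our own illustration, under the assumption, consistent with Remark 7’s reference to Equation (9), that the logical mutual information satisfies $I(\xi, \eta) = h(\xi) + h(\eta) - h(\xi \vee \eta)$; given the $m$-state values of $\xi$, $\eta$, and the join $\xi \vee \eta$, it also checks the independent case (cf. Theorem 9) numerically.

```python
def h(values):
    """Logical entropy from m-state values summing to 1."""
    return 1.0 - sum(v * v for v in values)

def logical_mutual_information(p_xi, p_eta, p_join):
    """I(xi, eta) = h(xi) + h(eta) - h(xi v eta); p_join lists the m-state values of the join."""
    return h(p_xi) + h(p_eta) - h(p_join)

# Independent IF-partitions: the state value of each element of the join
# factorizes as m(A_i) * m(B_j).
p_xi, p_eta = [0.5, 0.5], [0.25, 0.75]
p_join = [a * b for a in p_xi for b in p_eta]
I = logical_mutual_information(p_xi, p_eta, p_join)
print(I, h(p_xi) * h(p_eta))   # both 0.1875: under the assumed formula, I equals h(xi) * h(eta) here
```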
Example 3. Consider the family of IF-events from Example 2 and the state defined by the formula: Put where the functions are defined by for every and where the functions are defined by for every Evidently, the set is an IF-partition with the m-state values of the corresponding elements, and the logical entropy Further, we put where the functions are defined by for every and where the functions are defined by for every Then, the set is an IF-partition with the m-state values of the corresponding elements and the logical entropy The join of and is the system where with the m-state value of the corresponding elements. The logical entropy of is the number: Let us calculate the logical mutual information of IF-partitions By Equation (9), we get:
Theorem 9. If IF-partitions , and are independent, i.e., for then
Corollary 1. If IF-partitions are independent, then
In the following part, we define the logical conditional mutual information of IF-partitions and, using this notion, we establish the chain rules for logical mutual information of IF-partitions.
Definition 8. Let be IF-partitions of Then, the logical conditional mutual information of and assuming a realization of is defined by the formula:
Theorem 10. For IF-partitions of it holds:
Proof. The second equality is obtained analogously. ☐
The result of the previous theorem is illustrated by the following example.
Example 4. Consider the family of IF-events from Example 2, the state defined by Equation ( 10), and the IF-partitions from the previous example. In addition, put where for every .
We will show that The join of and is the system with the m-state values of the corresponding elements. By simple calculation, we obtain:and consequently In Example 3, we have calculated that . It is now possible to verify that the equality is valid.
Theorem 11 (Chain rules for logical mutual information). Let and be IF-partitions of Then, for it holds:
Proof. By (8), Theorem 7, and (11), we obtain
Definition 9. Let be IF-partitions of We say that is conditionally independent to assuming a realization of (and write ) if
Theorem 12. For IF-partitions of it holds: if and only if
Proof. Let
i.e.,
Then
and by Equation (6) we get:
However, this indicates that The reverse implication is obvious. ☐
Remark 8. According to the previous theorem, we may say that and are conditionally independent, assuming a realization of and write instead of
Theorem 13. For IF-partitions of such that we have
- (i)
- (ii)
- (iii)
Proof. (i) Since by the assumption
using the chain rule for logical mutual information, we obtain:
(ii) By Theorem 10, we have
Hence using (i), we can write:
(iii) From (ii), we obtain the inequality By Theorem 12, we can interchange and . Doing so, we obtain the inequality ☐
We note that, in the classical theory, the last claim of Theorem 13 is known as the data processing inequality.
5. Logical Entropy of IF-Dynamical Systems
The classical dynamical system is a quadruplet $(\Omega, S, P, T)$, where $(\Omega, S, P)$ is a probability space, and $T: \Omega \to \Omega$ is a measure preserving map, i.e., $A \in S$ implies $T^{-1}(A) \in S$ and $P(T^{-1}(A)) = P(A)$. Define by the equality for any Then, is a mapping with the property for any In addition, for any analogously, for any It is a motivation for the following definition.
Definition 10. Let be the family of all IF-events and be a state. Then, the triplet will be called an IF-dynamical system, if is such a mapping that the following conditions are satisfied:
- (i)
implies and ;
- (ii)
if and then
- (iii)
if then
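As the classical case described above suggests, a natural way to obtain such a map $\tau$ (an illustrative construction of our own, not a definition taken from the text) is to induce it from a measure preserving point map $T$ via $\tau(A) = (\mu_A \circ T, \nu_A \circ T)$; the sketch below does this on a finite universe and checks that the state is preserved.

```python
# A three-point universe with a T-invariant probability and a cyclic, measure preserving map T.
universe = ["w1", "w2", "w3"]
P = {"w1": 1 / 3, "w2": 1 / 3, "w3": 1 / 3}
T = {"w1": "w2", "w2": "w3", "w3": "w1"}

def state(A):
    """An assumed simple state of the form m(A) = sum_w mu_A(w) P(w) (cf. Example 1)."""
    mu, _ = A
    return sum(mu[w] * P[w] for w in universe)

def tau(A):
    """The induced map tau(A) = (mu_A o T, nu_A o T) on IF-events."""
    mu, nu = A
    return ({w: mu[T[w]] for w in universe}, {w: nu[T[w]] for w in universe})

A = ({"w1": 0.9, "w2": 0.1, "w3": 0.3}, {"w1": 0.05, "w2": 0.8, "w3": 0.6})
print(state(A), state(tau(A)))   # equal: tau leaves the state invariant, one of the requirements on tau
```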
Proposition 3. Let any IF-dynamical system be given. If is an IF-partition of then the system is also an IF-partition of
Proof. Since
exists, according to Definition 8,
and
This means that
exists. Moreover, we have:
and
Define and put where is the identity mapping on
Theorem 14. Let any IF-dynamical system be given. If are IF-partitions of then the following properties are satisfied:
- (i)
- (ii)
implies
- (iii)
- (iv)
- (v)
Proof. Assume that ,
The property (i) follows from the condition
(ii) If
then for each
there exists a subset
such that
Ø for
and
We get:
However, this indicates that .
(iii) Since
for
we get:
(iv) The proof is analogous to the proof of the previous property.
(v) We will prove by mathematical induction. For the case of
, the equality holds by Equation (7). We assume that the statement holds for a given
and we prove it is true for
By part (iii), we have:
Therefore, by Equation (7) and the induction assumption, we can write:
The proof is complete. ☐
Lemma 1. Let be a sequence of non-negative real numbers such that for every Then exists.
Proof. The proof can be found in [69]. ☐
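Lemma 1’s displayed condition was lost in extraction; assuming it is the standard subadditivity condition $a_{r+s} \leq a_r + a_s$ (so that, by Fekete’s lemma, $\lim_{n\to\infty} a_n/n$ exists), the convergence can be illustrated numerically:

```python
import math

def a(n):
    # A subadditive sequence of non-negative reals: a_{r+s} <= a_r + a_s,
    # since the square root is subadditive and the linear part is additive.
    return math.sqrt(n) + n / 2

for n in (1, 10, 100, 1000, 10000):
    print(n, a(n) / n)   # a_n / n converges (here to 1/2)
```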
Proposition 4. Let be an IF-dynamical system, and be an IF-partition of Then, there exists the following limit: Proof. Put
According to Theorem 5 and property (iii) of the previous theorem, for every
we have:
Hence, by Lemma 1, exists. ☐
Definition 11. Let be an IF-dynamical system, and be any IF-partition of The logical entropy of with respect to is defined by: The logical entropy of an IF-dynamical system is defined by the formula:
Example 5. Let be the family of all IF-events and be a state. Then, the triplet where is an identity mapping, is a trivial case of an IF-dynamical system. The operation is idempotent; therefore, and the logical entropy of is
Theorem 15. Let any IF-dynamical system be given. If are IF-partitions of such that then
Proof. If then for By property (ii) from Theorem 3, we have for Hence, ☐
Definition 12. Two IF-dynamical systems are said to be isomorphic if there exists a bijective mapping satisfying the following conditions:
- (i)
for every ;
- (ii)
for every ;
- (iii)
for every , exists if and only if exists, and
- (iv)
for every .
Lemma 2. Let be isomorphic IF-dynamical systems wherein a mapping represents their isomorphism. Let be an IF-partition of Then, the system is an IF-partition of with the logical entropy and moreover,
Proof. Since
exists, by condition (iii) of the previous definition
exists, and it holds
Therefore, by condition (iv) of the previous definition, we can write:
On the other hand,
This means that
is an IF-partition of
. Let us calculate:
Consequently, using conditions (i) and (ii) of the previous definition, we get:
Lemma 3. Let be isomorphic IF-dynamical systems wherein a mapping represents their isomorphism. Then, for the inverse the following properties are satisfied:
- (i)
for every
- (ii)
for any if exists, then exists, too, and
- (iii)
for every
- (iv)
for every
Proof. Since is bijective, for every there exist such that
- (i)
- (ii)
Let
such that
exists. Then,
exists because
is surjective. Let us calculate:
- (iii)
- (iv)
Let
Then we have
and
Hence, the equality holds. ☐
Theorem 16. If the IF-dynamical systems are isomorphic, then
Proof. Let
be a mapping representing an isomorphism of IF-dynamical systems
By Lemma 2, if
is an IF-partition of
then the system
is an IF-partition of
and
Therefore:
and consequently:
The opposite inequality is obtained in a similar way; it suffices to consider the inverse
If
is an IF-partition of
then it is easy to verify that
is an IF-partition of
Indeed, since
exists, according to property (ii) from Lemma 3,
exists, too. Moreover, we have:
and
By means of (iii) from the previous lemma, we get:
Thus, according to the previous lemma, we can write:
Therefore:
and consequently:
The proof is completed. ☐
In the final part, we prove an analogy of the Kolmogorov–Sinai theorem on generators for the studied situation. This theorem (see, e.g., [69]) is the main tool for calculating the entropy of a dynamical system. First, analogously to [62], we introduce the following definition.
Definition 13. Let be an IF-dynamical system and be an IF-partition of Then is called an m-generator of if to any IF-partition of there exists an integer > 0 such that
Proposition 5. Let be an IF-dynamical system, and be an IF-partition of Then, for each natural number k, it holds
Proof. Let be any IF-partition of Then, for each natural number k, we can write:
Theorem 17. Let be an IF-dynamical system and be an m-generator of Then
Proof. Let be an m-generator of Then to any IF-partition of there exists an integer > 0 such that Consequently by Theorem 15 and Proposition 5, for every IF-partition of we have: