
User:Guillaume.Aucher/sandbox: Difference between revisions

From Wikipedia, the free encyclopedia
Revision as of 19:12, 10 December 2015

Dynamic Epistemic Logic (DEL) is a logic for specifying and reasoning about information change and exchange of information in multi-agent systems. It is built on top of epistemic logic to which it adds dynamics and events. Epistemic logic will first be recalled. Then, actions and events will enter into the picture and we will introduce the logical framework of dynamic epistemic logic.[1]

Epistemic Logic

Epistemic logic is a modal logic concerned with the logical study of the notions of knowledge and belief. It is thereby concerned with understanding the process of reasoning about knowledge and belief: which principles relating the notions of knowledge and belief are intuitively plausible? Like 'epistemology', the term stems from the Greek word 'episteme', meaning knowledge. But epistemology is more concerned with analyzing the very nature of knowledge (addressing questions such as "What is the definition of knowledge?" or "How is knowledge acquired?"). In fact, epistemic logic grew out of epistemology in the Middle Ages thanks to the efforts of Burley and Ockham. But the formal work, based on modal logic, that inaugurated contemporary research into epistemic logic dates back only to 1962 and is due to Hintikka. It then sparked in the 1960s discussions about the principles of knowledge and belief, and many axioms for these notions were proposed and discussed. For example, the interaction axioms <math>K_j\phi\rightarrow B_j\phi</math> and <math>B_j\phi\rightarrow K_jB_j\phi</math> are often considered to be intuitive principles: if agent <math>j</math> knows <math>\phi</math> then (s)he also believes <math>\phi</math>, and if agent <math>j</math> believes <math>\phi</math>, then (s)he knows that (s)he believes <math>\phi</math>. More recently, these kinds of philosophical theories were taken up by researchers in economics, artificial intelligence and theoretical computer science, where reasoning about knowledge is a central topic. Due to the new setting in which epistemic logic was used, new perspectives and new features such as computability issues were added to the agenda of epistemic logic.

Syntax and Semantics

Syntax.

The epistemic language is an extension of the basic multi-modal language of modal logic with a common knowledge operator <math>C_{A}</math> and a distributed knowledge operator <math>D_{A}</math>. The epistemic language <math>\mathcal{L}_{\textsf{EL}}</math> is defined inductively as follows:

<math>\mathcal{L}_{\textsf{EL}}:\phi~~::=~~ p~\mid~\neg\phi~\mid~(\phi\land\phi)~\mid~ K_j\phi~\mid~ C_{A}\phi~\mid~ D_{A}\phi</math>

where <math>p\in PROP</math>, <math>j\in AGTS</math> and <math>A\subseteq AGTS</math>. The formula <math>\langle K_{j}\rangle\alpha</math> is an abbreviation for <math>\neg K_j\neg\alpha</math>, <math>E_{A}\phi</math> is an abbreviation for <math>\bigwedge_{j\in A} K_j\phi</math>, and <math>C\phi</math> is an abbreviation for <math>C_{AGTS}\phi</math>.

Group notions: general, common and distributed knowledge.

In a multi-agent setting there are three important epistemic concepts: general belief (or knowledge), distributed belief (or knowledge) and common belief (or knowledge). The notion of common belief (or knowledge) was first studied by Lewis in the context of conventions. It was then applied to distributed systems and to game theory, where it allows one to express that the rationality of the players, the rules of the game and the set of players are commonly known.

General knowledge.

General belief of <math>\phi</math> means that everybody in the group of agents <math>A</math> believes that <math>\phi</math>. Formally this corresponds to the following formula:

<math>E_A\phi := \bigwedge_{j\in A} K_j\phi</math>

Common knowledge.

Common belief of <math>\phi</math> means that everybody believes <math>\phi</math> but also that everybody believes that everybody believes <math>\phi</math>, that everybody believes that everybody believes that everybody believes <math>\phi</math>, and so on ad infinitum. Formally, this corresponds to the following infinite conjunction:

<math>E_A\phi\land E_AE_A\phi\land E_AE_AE_A\phi\land\ldots</math>

As we do not allow infinite conjunctions, the notion of common knowledge has to be introduced as a primitive in our language.

Before defining the language with this new operator, we give an example introduced by Lewis that illustrates the difference between these two notions (here we exceptionally use the notion of knowledge instead of belief to make things clearer). Lewis wanted to know what kind of knowledge is needed so that the statement <math>\phi</math>: "every driver must drive on the right" be a convention among a group of agents. In other words, he wanted to know what kind of knowledge is needed so that everybody feels safe to drive on the right. Suppose there are only two agents <math>j</math> and <math>k</math>. Then everybody knowing <math>\phi</math> (formally <math>E\phi</math>) is not enough. Indeed, it might still be possible that agent <math>j</math> considers possible that agent <math>k</math> does not know <math>\phi</math> (formally <math>\langle K_j\rangle\neg K_k\phi</math>). In that case agent <math>j</math> will not feel safe to drive on the right, because he might consider that agent <math>k</math>, not knowing <math>\phi</math>, could drive on the left. To avoid this problem, we could then assume that everybody knows that everybody knows <math>\phi</math> (formally <math>EE\phi</math>). This is again not enough to ensure that everybody feels safe to drive on the right. Indeed, it might still be possible that agent <math>j</math> considers possible that agent <math>k</math> considers possible that agent <math>j</math> does not know <math>\phi</math> (formally <math>\langle K_j\rangle\langle K_k\rangle\neg K_j\phi</math>). In that case, from <math>j</math>'s point of view, <math>k</math> considers possible that <math>j</math>, not knowing <math>\phi</math>, will drive on the left. So from <math>j</math>'s point of view, <math>k</math> might drive on the left as well (by the same argument as above). So <math>j</math> will not feel safe to drive on the right. Reasoning by induction, Lewis showed that for any <math>n\in\mathbb{N}</math>, <math>E^n\phi</math> is not enough for the drivers to feel safe to drive on the right. In fact what we need is an infinite conjunction. In other words, we need common knowledge of <math>\phi</math>: <math>C\phi := E\phi\land EE\phi\land EEE\phi\land\ldots</math>
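The iterated "everybody knows" operator from Lewis's argument can be computed directly on a small toy model. The following is an illustrative sketch, not part of the original text; the model, the world numbering and the helper names are invented for the example:

```python
# Toy S5 model with two agents a and b and four worlds 0..3.
# p holds at worlds 0, 1 and 2, but not at 3.
# Indistinguishability classes (equivalence relations):
classes = {'a': [{0}, {1, 2}, {3}], 'b': [{0, 1}, {2, 3}]}

def cls(agent, w):
    """The equivalence class of world w for the given agent."""
    return next(c for c in classes[agent] if w in c)

def everybody_knows(S):
    """E S: worlds where every agent's class is contained in S."""
    return {w for w in range(4) if all(cls(j, w) <= S for j in classes)}

P = {0, 1, 2}                     # worlds where p holds
e1 = everybody_knows(P)           # worlds satisfying E p
e2 = everybody_knows(e1)          # worlds satisfying EE p
e3 = everybody_knows(e2)          # worlds satisfying EEE p
print(0 in e1, 0 in e2, 0 in e3)  # True True False
```

At world 0 both <math>E\phi</math> and <math>EE\phi</math> hold, but <math>EEE\phi</math> fails, so <math>\phi</math> is not common knowledge there: each further level of the infinite conjunction can fail, which is why the common knowledge operator must be taken as a primitive.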

Distributed knowledge.

Distributed knowledge of <math>\phi</math> means that if the agents pooled their knowledge together, they would know that <math>\phi</math> holds. In other words, the knowledge of <math>\phi</math> is distributed among the agents. The formula <math>D_A\phi</math> reads as 'it is distributed knowledge among the set of agents <math>A</math> that <math>\phi</math> holds'.

Semantics.

Epistemic logic is a modal logic. So, what we call an epistemic model <math>\mathcal{M}=(W,R_1,\ldots,R_n,V)</math> is just a Kripke model as used in modal logic. The possible worlds <math>W</math> are the relevant worlds needed to define such a representation and the valuation <math>V</math> specifies which propositional facts (such as 'it is raining') are true in these worlds. Finally, the accessibility relations <math>R_j</math> can model either the notion of knowledge or the notion of belief. We set <math>w'\in R_j(w)</math> in case the world <math>w'</math> is compatible with agent <math>j</math>'s belief (respectively knowledge) in world <math>w</math>. Intuitively, a pointed epistemic model <math>(\mathcal{M},w_a)</math>, where <math>w_a\in\mathcal{M}</math>, represents from an external point of view how the actual world <math>w_a</math> is perceived by the agents <math>AGTS</math>.

For every epistemic model <math>\mathcal{M}</math>, <math>w\in\mathcal{M}</math> and <math>\phi\in\mathcal{L}_{\textsf{EL}}</math>, define:

<math>\mathcal{M},w\models p</math> iff <math>w\in V(p)</math>
<math>\mathcal{M},w\models\neg\phi</math> iff it is not the case that <math>\mathcal{M},w\models\phi</math>
<math>\mathcal{M},w\models\phi\land\phi'</math> iff <math>\mathcal{M},w\models\phi</math> and <math>\mathcal{M},w\models\phi'</math>
<math>\mathcal{M},w\models K_j\phi</math> iff for all <math>v\in R_j(w)</math>, <math>\mathcal{M},v\models\phi</math>
<math>\mathcal{M},w\models C_A\phi</math> iff for all <math>v\in R_A^+(w)</math>, <math>\mathcal{M},v\models\phi</math>
<math>\mathcal{M},w\models D_A\phi</math> iff for all <math>v\in\bigcap_{j\in A}R_j(w)</math>, <math>\mathcal{M},v\models\phi</math>

where <math>R_A^+</math> is the transitive closure of <math>\bigcup_{j\in A}R_j</math>: we have <math>v\in R_A^+(w)</math> if, and only if, there are <math>w_0=w,w_1,\ldots,w_n=v</math> and <math>j_1,\ldots,j_n\in A</math> such that for all <math>i\in\{1,\ldots,n\}</math>, <math>w_{i-1}R_{j_i}w_i</math>.

Despite the fact that the notion of common belief has to be introduced as a primitive in the language, we can notice in this definition that epistemic models do not have to be modified in order to give a truth value to the common knowledge and distributed knowledge operators.
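The satisfaction relation can be prototyped directly: common knowledge is evaluated via the transitive closure of the union of the relations, and distributed knowledge via their intersection. The following is a minimal sketch, not part of the original text; the class, the tuple encoding of formulas and the two-world model are invented for illustration:

```python
class EpistemicModel:
    """Kripke model: worlds, one accessibility relation per agent, valuation."""
    def __init__(self, worlds, rel, val):
        self.worlds = set(worlds)  # world names
        self.rel = rel             # dict: agent -> set of (w, v) pairs
        self.val = val             # dict: world -> set of true atoms

    def succ(self, j, w):
        return {v for (u, v) in self.rel[j] if u == w}

def holds(m, w, phi):
    """Check m, w |= phi.  Formulas are nested tuples, e.g. ('K', 'a', ('p', 'q'))."""
    op = phi[0]
    if op == 'p':    return phi[1] in m.val[w]
    if op == 'not':  return not holds(m, w, phi[1])
    if op == 'and':  return holds(m, w, phi[1]) and holds(m, w, phi[2])
    if op == 'K':    return all(holds(m, v, phi[2]) for v in m.succ(phi[1], w))
    if op == 'D':    # distributed knowledge: intersect the relations
        vs = set(m.worlds)
        for j in phi[1]:
            vs &= m.succ(j, w)
        return all(holds(m, v, phi[2]) for v in vs)
    if op == 'C':    # common knowledge: transitive closure of the union
        seen, frontier = set(), {w}
        while frontier:
            u = frontier.pop()
            for j in phi[1]:
                for v in m.succ(j, u):
                    if v not in seen:
                        seen.add(v)
                        frontier.add(v)
        return all(holds(m, v, phi[2]) for v in seen)
    raise ValueError(op)

# Two worlds; agent a cannot tell them apart, agent b can; q holds only at 1.
m = EpistemicModel({1, 2},
                   {'a': {(1, 1), (1, 2), (2, 1), (2, 2)},
                    'b': {(1, 1), (2, 2)}},
                   {1: {'q'}, 2: set()})
print(holds(m, 1, ('K', 'a', ('p', 'q'))))         # False: a confuses 1 and 2
print(holds(m, 1, ('D', ['a', 'b'], ('p', 'q'))))  # True: pooling singles out 1
print(holds(m, 1, ('C', ['a', 'b'], ('p', 'q'))))  # False: world 2 is reachable
```

The example shows the three group notions coming apart on one and the same model, without any change to the model itself, as noted above.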

[Card example][example9.1]

Ann (A), Bob (B) and Claire (C) play a card game with three cards: a red one, a green one and a blue one. Each of them has a single card but they do not know the cards of the other players. This example is depicted in Figure [fig1].

The notion of knowledge might comply with some constraints (or axioms) such as <math>K_j\phi\rightarrow K_jK_j\phi</math>: if agent <math>j</math> knows something, she knows that she knows it. These constraints might affect the nature of the accessibility relations, which may then comply with some extra properties. So, we are now going to define some particular classes of epistemic models that all add some extra constraints on the accessibility relations <math>R_j</math>. These constraints are matched by particular axioms for the knowledge operator <math>K_j</math>.

[Properties of accessibility relations][def:9.2] We give in Figure [Figure2.1] a list of properties of the accessibility relations. We also give, below each property, the axiom which defines the class of epistemic frames that fulfill this property. We choose, without any particular reason, to use the knowledge modality <math>K_j</math> to write these conditions.

serial: <math>\forall w\,\exists v\ (wR_jv)</math>
: axiom D: <math>K_j\phi\rightarrow\langle K_j\rangle\phi</math>
transitive: if <math>wR_jv</math> and <math>vR_ju</math>, then <math>wR_ju</math>
: axiom 4: <math>K_j\phi\rightarrow K_jK_j\phi</math>
Euclidean: if <math>wR_jv</math> and <math>wR_ju</math>, then <math>vR_ju</math>
: axiom 5: <math>\neg K_j\phi\rightarrow K_j\neg K_j\phi</math>
reflexive: <math>wR_jw</math> for all <math>w</math>
: axiom T: <math>K_j\phi\rightarrow\phi</math>
symmetric: if <math>wR_jv</math>, then <math>vR_jw</math>
: axiom B: <math>\phi\rightarrow K_j\langle K_j\rangle\phi</math>
confluent: if <math>wR_jv</math> and <math>wR_ju</math>, then there is <math>s</math> with <math>vR_js</math> and <math>uR_js</math>
: axiom .2: <math>\langle K_j\rangle K_j\phi\rightarrow K_j\langle K_j\rangle\phi</math>
weakly connected: if <math>wR_jv</math> and <math>wR_ju</math>, then <math>vR_ju</math> or <math>v=u</math> or <math>uR_jv</math>
: axiom .3: <math>K_j((\phi\land K_j\phi)\rightarrow\psi)\lor K_j((\psi\land K_j\psi)\rightarrow\phi)</math>
semi-Euclidean: if <math>wR_jv</math> and <math>wR_ju</math> and not <math>uR_jw</math>, then <math>vR_ju</math>
: axiom .3.2
R1: if <math>wR_jv</math> and <math>wR_ju</math> and <math>v\neq w</math>, then <math>vR_ju</math>
: axiom .4: <math>(\phi\land\langle K_j\rangle K_j\phi)\rightarrow K_j\phi</math>
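These frame conditions are easy to test mechanically for a finite relation given as a set of pairs. The following is an illustrative sketch, not part of the original text; the helper names and the example relations are invented:

```python
def reflexive(R, W):  return all((w, w) in R for w in W)
def serial(R, W):     return all(any((w, v) in R for v in W) for w in W)
def symmetric(R, W):  return all((v, w) in R for (w, v) in R)
def transitive(R, W): return all((w, u) in R
                                 for (w, v) in R for (x, u) in R if x == v)
def euclidean(R, W):  return all((v, u) in R
                                 for (w, v) in R for (x, u) in R if x == w)

# An equivalence relation (classes {1, 2} and {3}) validates T, 4, 5 and B:
W = {1, 2, 3}
R = {(w, v) for w in W for v in W if (w <= 2) == (v <= 2)}
print(reflexive(R, W), transitive(R, W), euclidean(R, W))  # True True True

# A serial, transitive, Euclidean relation (a KD45 frame) need not be reflexive:
W2 = {1, 2}
R2 = {(1, 2), (2, 2)}
print(serial(R2, W2), transitive(R2, W2), euclidean(R2, W2), reflexive(R2, W2))
# True True True False
```

The second relation models a believer: world 1 (where the agent actually is) is not among the worlds she considers possible, so her beliefs can be mistaken while remaining consistent, serial, transitive and Euclidean.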

Knowledge versus Belief

In this chapter we use the same notation for both knowledge and belief. Hence, depending on the context, <math>K_j\phi</math> will either read 'agent <math>j</math> knows that <math>\phi</math> holds' or 'agent <math>j</math> believes that <math>\phi</math> holds'. A crucial difference is that, unlike knowledge, beliefs can be wrong: the Truth axiom <math>K_j\phi\rightarrow\phi</math> holds only for knowledge, not necessarily for belief. In the next section, we are going to examine other axioms, some of which pertain more to the notion of knowledge whereas others pertain more to the notion of belief.

Axiomatization

The axioms of an epistemic logic obviously display the way the agents reason. For example the axiom K, <math>K_j(\phi\rightarrow\psi)\rightarrow(K_j\phi\rightarrow K_j\psi)</math>, entails that if I know <math>\phi</math> (<math>K_j\phi</math>) and I know that <math>\phi</math> implies <math>\psi</math> (<math>K_j(\phi\rightarrow\psi)</math>), then I know <math>\psi</math> (<math>K_j\psi</math>). Stronger constraints can be added. The following proof systems are often used in the literature.

[Proof Systems for <math>\mathcal{L}_{\textsf{EL}}</math>] We define the following proof systems for <math>\mathcal{L}_{\textsf{EL}}</math>:

<math>\textsf{K}</math> = propositional logic + axiom K + the rule of necessitation (from <math>\phi</math>, infer <math>K_j\phi</math>)
<math>\textsf{KD45}</math> = <math>\textsf{K}</math> + D + 4 + 5
<math>\textsf{S4}</math> = <math>\textsf{K}</math> + T + 4
<math>\textsf{S4.2}</math> = <math>\textsf{S4}</math> + .2
<math>\textsf{S4.3}</math> = <math>\textsf{S4}</math> + .3
<math>\textsf{S4.4}</math> = <math>\textsf{S4}</math> + .4
<math>\textsf{S5}</math> = <math>\textsf{S4}</math> + 5

We denote by <math>\mathcal{P}</math> the set of proof systems <math>\{\textsf{K},\textsf{KD45},\textsf{S4},\textsf{S4.2},\textsf{S4.3},\textsf{S4.4},\textsf{S5}\}</math>.

Moreover, for all <math>\textsf{L}\in\mathcal{P}</math>, we define the proof system <math>\textsf{L}^{C}</math> by adding the following axiom schemes and rules of inference to those of <math>\textsf{L}</math>. For all <math>j\in A</math>,

Dist: <math>K_j\phi\rightarrow D_A\phi</math>
E: <math>E_A\phi\leftrightarrow\bigwedge_{j\in A}K_j\phi</math>
Mix: <math>C_A\phi\rightarrow E_A(\phi\land C_A\phi)</math>
Induction Rule: from <math>\psi\rightarrow E_A(\phi\land\psi)</math>, infer <math>\psi\rightarrow C_A\phi</math>

The relative strength of the proof systems for knowledge is as follows:

<math>\textsf{S4}\subset\textsf{S4.2}\subset\textsf{S4.3}\subset\textsf{S4.4}\subset\textsf{S5}</math>

So, all the theorems of <math>\textsf{S4}</math> are also theorems of <math>\textsf{S4.2}</math>, <math>\textsf{S4.3}</math>, <math>\textsf{S4.4}</math> and <math>\textsf{S5}</math>. Many philosophers claim that in the most general cases, the logic of knowledge is <math>\textsf{S4.2}</math> or <math>\textsf{S4.3}</math>. Typically, in computer science, the logic of belief (doxastic logic) is taken to be <math>\textsf{KD45}</math> and the logic of knowledge (epistemic logic) is taken to be <math>\textsf{S5}</math>, even if the logic <math>\textsf{S5}</math> is only suitable for situations where the agents do not have mistaken beliefs.
We discuss the most important axioms of Figure [Figure2.1]. Axioms T and 4 state that if the agent knows a proposition, then this proposition is true (axiom T, for Truth), and that if the agent knows a proposition, then she knows that she knows it (axiom 4, also known as the "KK-principle" or "KK-thesis"). Axiom T is often considered to be the hallmark of knowledge and has not been subjected to any serious attack. In epistemology, axiom 4 tends to be accepted by internalists, but not by externalists. Axiom 4 is nevertheless widely accepted by computer scientists (but also by many philosophers, including Plato, Aristotle, Saint Augustine, Spinoza and Schopenhauer, as Hintikka recalls). A more controversial axiom for the logic of knowledge is axiom 5: <math>\neg K_j\phi\rightarrow K_j\neg K_j\phi</math>. This axiom states that if the agent does not know a proposition, then she knows that she does not know it. The addition of axiom 5 to <math>\textsf{S4}</math> yields the logic <math>\textsf{S5}</math>. Most philosophers (including Hintikka) have attacked this axiom, since numerous examples from everyday life seem to invalidate it.[2] In general, axiom 5 is invalidated when the agent has mistaken beliefs, which can be due for example to misperceptions, lies or other forms of deception. Axiom D states that the agent's beliefs are consistent. In combination with axiom K (where the knowledge operator is replaced by a belief operator), axiom D is in fact equivalent to a simpler axiom which conveys, maybe more explicitly, the fact that the agent's beliefs cannot be inconsistent: <math>\neg B_j\bot</math>. In all the theories of rational agency developed in artificial intelligence, the logic of belief is <math>\textsf{KD45}</math>. Note that all these agent theories follow the perfect external approach. This is at odds with their intention to implement their theories in machines. In that respect, an internal approach seems to be more appropriate since, in this context, the agent needs to reason from its own internal point of view. For the internal approach, the logic of belief is <math>\textsf{KD45}</math>, as has been proved for the notion of full belief.[3]

[Classes of Epistemic Models] For all <math>\textsf{L}\in\mathcal{P}</math>, the class of <math>\textsf{L}</math>-models (or <math>\textsf{L}^{C}</math>-models) is the class of epistemic models whose accessibility relations satisfy the properties listed in Figure [Figure2.1] defined by the axioms of <math>\textsf{L}</math> (or <math>\textsf{L}^{C}</math>).

[Soundness and Completeness] For all <math>\textsf{L}\in\mathcal{P}</math>, <math>\textsf{L}</math> is sound and strongly complete for <math>\mathcal{L}_{\textsf{EL}}</math> without the common and distributed knowledge operators w.r.t. the class of <math>\textsf{L}</math>-models, and <math>\textsf{L}^{C}</math> is sound and strongly complete for <math>\mathcal{L}_{\textsf{EL}}</math> w.r.t. the class of <math>\textsf{L}^{C}</math>-models.

Decidability

All the logics introduced are decidable. We list in Figure [fig:3.4.] the complexity of the satisfiability problem for each of them. Note that the satisfiability problem for these logics becomes linear-time if there are only finitely many propositional letters in the language. If we restrict to finite nesting of the modal operators, then the satisfiability problem is NP-complete for all the modal logics considered except <math>\textsf{S4}</math>. If we then further restrict the language to having only finitely many primitive propositions, the complexity goes down to linear time in all cases.

Complexity of the satisfiability problem (one agent / several agents / with common knowledge):

<math>\textsf{K}</math>, <math>\textsf{S4}</math>: PSPACE / PSPACE / EXPTIME
<math>\textsf{KD45}</math>: NP / PSPACE / EXPTIME
<math>\textsf{S5}</math>: NP / PSPACE / EXPTIME

The computational complexity of the model checking problem is in P in all cases.

Adding Dynamics

Dynamic Epistemic Logic (DEL) is a formalism for modelling epistemic situations involving several agents, and the changes that can occur to these situations after incoming information, or more generally after incoming action. The methodology of DEL is such that it splits the task of representing the agents' beliefs and knowledge into three parts:

  1. One represents their beliefs about an initial situation thanks to an epistemic model;
  2. One represents their beliefs about an event taking place in this situation thanks to an event model;
  3. One represents the way the agents update their beliefs about the situation after (or during) the occurrence of the event thanks to a product update.


Typically, an informative event can be a public announcement to all the agents of a formula <math>\psi</math>: this public announcement and correlative update constitute the dynamic part. Note that epistemic events can be much more complex than simple public announcements, including hiding information from some of the agents, cheating, etc. This complexity is dealt with in Section [sec 2.2], introducing the notion of event model. In Section [sec:2.3.2.1], we will first focus on public announcements to get an intuition of the main underlying ideas that occur in DEL.

The Simple Case: Public Announcement Logic

[sec:2.3.2.1] We start by giving a concrete example where DEL can be used, to better understand what is going on. Then we will present a sketchy formalization of the phenomenon, called Public Announcement Logic (PAL).

[Muddy children] We have two children, A and B, both dirty. A can see B but not himself, and B can see A but not herself. Let <math>p_A</math> be the proposition stating that A is dirty, and <math>p_B</math> be the proposition stating that B is dirty.

  1. We represent the initial situation by the epistemic model represented in Figure [fig:2.4], where the relations are equivalence relations. States intuitively represent possible worlds; a proposition (for example <math>p_A</math>) satisfiable at one of these states intuitively means that in the possible world corresponding to this state, the intuitive interpretation of <math>p_A</math> (A is dirty) is true. The links between states labelled by agents (A or B) intuitively express a notion of indistinguishability for the agent at stake between two possible worlds. For example, a link between two states <math>w</math> and <math>w'</math> labelled by A intuitively means that A cannot distinguish the possible world <math>w</math> from <math>w'</math> and vice versa. Indeed, A cannot see himself, so he cannot distinguish between a world where he is dirty and one where he is not dirty. However, he can distinguish between worlds where B is dirty or not, because he can see B. With this intuitive interpretation, we are brought to assume that our relations between states are equivalence relations.
  2. Now, suppose that their father comes and announces that at least one of them is dirty. Then we update the model, and this yields the epistemic model of Figure [fig:2.5]. What we actually do is suppress the worlds where the content of the announcement is not fulfilled: in our case this is the world where <math>\neg p_A</math> and <math>\neg p_B</math> are true. This suppression is what we call the update. We then get the model depicted. As a result of the announcement, both A and B know that at least one of them is dirty. We can read this from the model.
  3. Now suppose there is a second (and final) announcement that says that neither knows they are dirty (an announcement can express facts about the situation as well as epistemic facts about the knowledge held by the agents). We then update the model similarly, by suppressing the worlds which do not satisfy the content of the announcement, or equivalently by keeping the worlds which do satisfy it. This update process thus yields the epistemic model of Figure [fig:2.6]. By interpreting this model, we get that A and B both know that they are dirty, which seems to contradict the content of the announcement. Actually, we see here at work one of the main features of the update process: a proposition is not necessarily true after being announced; such a formula is said not to be "self-persistent", and this phenomenon arises for epistemic formulas (unlike propositional formulas). One must not confuse the announcement with the update induced by this announcement, which might cancel some of the information encoded in the announcement (as in our example).
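The two announcements and their world-suppressing updates can be replayed concretely. This is an illustrative sketch, not the chapter's own code; the world encoding and the helper names are invented:

```python
# Worlds are pairs (dirty_A, dirty_B); each child sees only the other one.
worlds = {(a, b) for a in (0, 1) for b in (0, 1)}

def knows_dirty(agent, w, ws):
    """In model ws at world w, does the agent know that she is dirty?
    She knows it iff she is dirty in every world she cannot distinguish
    from w, i.e. every world agreeing with w on the *other* child."""
    i = 0 if agent == 'A' else 1
    indist = {v for v in ws if v[1 - i] == w[1 - i]}
    return all(v[i] for v in indist)

def announce(ws, content):
    """Public announcement: keep only the worlds satisfying the content."""
    return {w for w in ws if content(w)}

actual = (1, 1)                       # both children are dirty
# Father: "at least one of you is dirty"
ws1 = announce(worlds, lambda w: w[0] or w[1])
print(knows_dirty('A', actual, ws1))  # False: A still cannot tell
# Second announcement: "neither of you knows that they are dirty"
ws2 = announce(ws1, lambda w: not knows_dirty('A', w, ws1)
                              and not knows_dirty('B', w, ws1))
print(knows_dirty('A', actual, ws2))  # True: only (1, 1) survives
print(ws2)                            # {(1, 1)}
```

Note that in the final model both children know they are dirty, so the second announcement is no longer true after its own update: this is exactly the failure of self-persistence described above.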

Public Announcement Logic (PAL)

Now we roughly present the main ingredients of a logic called Public Announcement Logic (PAL), which formalizes these ideas and combines epistemic and dynamic logic.
We have seen that a public announcement of a proposition <math>\psi</math> changes the current epistemic model <math>\mathcal{M}</math> with actual world <math>w</math>, as in Figure [fig2.7].


[Language <math>\mathcal{L}_{PAL}</math>][def:05] We define the language <math>\mathcal{L}_{PAL}</math> inductively as follows:

<math>\mathcal{L}_{PAL}:\phi~~::=~~ p~\mid~\neg\phi~\mid~(\phi\land\phi)~\mid~ K_j\phi~\mid~[\psi!]\phi</math>

where <math>p\in PROP</math> and <math>j\in AGTS</math>. The formula <math>\langle K_{j}\rangle\phi</math> is an abbreviation for <math>\neg K_j\neg\phi</math> and <math>\langle\psi!\rangle\phi</math> is an abbreviation for <math>\neg[\psi!]\neg\phi</math>. The epistemic language is interpreted as in Definition [def:1.2.6], while the semantic clause for the new dynamic action modality is "forward looking" among models, as follows:

<math>\mathcal{M},w\models[\psi!]\phi</math> iff <math>\mathcal{M},w\models\psi</math> implies <math>\mathcal{M}^\psi,w\models\phi</math>,

where <math>\mathcal{M}^\psi=(W^\psi,R_1^\psi,\ldots,R_n^\psi,V^\psi)</math> with <math>W^\psi:=\{v\in W\mid\mathcal{M},v\models\psi\}</math>, <math>R_j^\psi:=R_j\cap(W^\psi\times W^\psi)</math> for all <math>j\in AGTS</math>, and <math>V^\psi(p):=V(p)\cap W^\psi</math>.

The formula <math>[\psi!]\phi</math> intuitively means that after a truthful announcement of <math>\psi</math>, <math>\phi</math> holds. The formula <math>\langle\psi!\rangle\phi</math> intuitively means that the announcement <math>\psi</math> is possible and that after this announcement <math>\phi</math> holds.

[Soundness and Completeness] The proof system defined below is sound and strongly complete for <math>\mathcal{L}_{PAL}</math> w.r.t. the class of epistemic models:

ML: the axioms and rules of inference of the underlying multi-modal epistemic logic
Red1: <math>[\psi!]p\leftrightarrow(\psi\rightarrow p)</math>
Red2: <math>[\psi!]\neg\phi\leftrightarrow(\psi\rightarrow\neg[\psi!]\phi)</math>
Red3: <math>[\psi!](\phi\land\phi')\leftrightarrow([\psi!]\phi\land[\psi!]\phi')</math>
Red4: <math>[\psi!]K_j\phi\leftrightarrow(\psi\rightarrow K_j(\psi\rightarrow[\psi!]\phi))</math>
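The reduction axioms can be read as a terminating rewriting procedure that compiles announcement modalities away, yielding a purely epistemic formula. Below is a small illustrative sketch, not from the original text; the tuple encoding and the function name are invented:

```python
def reduce_pal(phi):
    """Rewrite an L_PAL formula (nested tuples) into a purely epistemic one.
    ('ann', psi, body) encodes [psi!]body; ('imp', f, g) encodes f -> g."""
    op = phi[0]
    if op == 'p':
        return phi
    if op == 'not':
        return ('not', reduce_pal(phi[1]))
    if op in ('and', 'imp'):
        return (op, reduce_pal(phi[1]), reduce_pal(phi[2]))
    if op == 'K':
        return ('K', phi[1], reduce_pal(phi[2]))
    if op == 'ann':
        psi, body = reduce_pal(phi[1]), phi[2]
        if body[0] == 'p':        # Red1: [psi!]p <-> (psi -> p)
            return ('imp', psi, body)
        if body[0] == 'not':      # Red2
            return ('imp', psi, ('not', reduce_pal(('ann', psi, body[1]))))
        if body[0] == 'and':      # Red3
            return ('and', reduce_pal(('ann', psi, body[1])),
                           reduce_pal(('ann', psi, body[2])))
        if body[0] == 'K':        # Red4
            return ('imp', psi, ('K', body[1],
                    ('imp', psi, reduce_pal(('ann', psi, body[2])))))
        if body[0] == 'ann':      # nested announcement: reduce the inner one first
            return reduce_pal(('ann', psi, reduce_pal(body)))
    raise ValueError(phi)

q = ('p', 'q')
# [q!]K_j q rewrites to q -> K_j (q -> (q -> q)):
print(reduce_pal(('ann', q, ('K', 'j', q))))
```

Since <math>q\rightarrow(q\rightarrow q)</math> is a tautology, the resulting epistemic formula is a theorem, matching the calculation with the reduction axioms.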

Here is a typical calculation using the reduction axioms:

<math>[q!]K_j q~\leftrightarrow~(q\rightarrow K_j(q\rightarrow[q!]q))~\leftrightarrow~(q\rightarrow K_j(q\rightarrow(q\rightarrow q)))~\leftrightarrow~(q\rightarrow K_j\top)</math>

The last formula is a theorem, so <math>[q!]K_jq</math> is derivable: after a public announcement of <math>q</math>, the agent knows that <math>q</math> holds.

PAL is decidable, its model checking problem is solvable in polynomial time and its satisfiability problem is PSPACE-complete.

The General Case: Event Models and Product Update

In this section, we focus on items 2 and 3 of the DEL methodology above, namely on how to represent events and on how to update an epistemic model with such a representation of events by means of a product update.

Representation of Events

The propositional letters describing events are called atomic events and are of the form <math>p_\psi</math>, where <math>\psi</math> ranges over <math>\mathcal{L}_{\textsf{EL}}</math>. The reading of <math>p_\psi</math> is "an event of precondition <math>\psi</math> is occurring".

[Event Language <math>\mathcal{L}_\alpha</math>][def:05] We define the language <math>\mathcal{L}_\alpha</math> inductively as follows:

<math>\mathcal{L}_\alpha:\alpha~~::=~~ p_\psi~\mid~\neg\alpha~\mid~(\alpha\land\alpha)~\mid~ B_j\alpha</math>

where <math>\psi\in\mathcal{L}_{\textsf{EL}}</math> and <math>j\in AGTS</math>. The formula <math>\langle B_{j}\rangle\alpha</math> is an abbreviation for <math>\neg B_j\neg\alpha</math>.

A pointed event model <math>(\mathcal{E},e)</math> represents how the actual event represented by <math>e</math> is perceived by the agents. Intuitively, <math>f\in R_j(e)</math> means that while the possible event represented by <math>e</math> is occurring, agent <math>j</math> considers possible that the possible event represented by <math>f</math> is actually occurring.

[Event Model] An event model is a tuple <math>\mathcal{E}=(W_\alpha,R_1,\ldots,R_n,I)</math> where:

  • <math>W_\alpha</math> is a non-empty set of possible events,
  • <math>R_j</math> is an accessibility relation on <math>W_\alpha</math>, for each <math>j\in AGTS</math>,
  • <math>I:W_\alpha\rightarrow\mathcal{L}_{\textsf{EL}}</math> is a function assigning to each possible event a formula of <math>\mathcal{L}_{\textsf{EL}}</math>. The function <math>I</math> is called the precondition function.

We write <math>e\in\mathcal{E}</math> for <math>e\in W_\alpha</math>, and <math>(\mathcal{E},e)</math> is called a pointed event model (<math>e</math> often represents the actual event). <math>R_j(e)</math> denotes the set <math>\{f\in W_\alpha\mid R_j ef\}</math>.

The truth conditions of the language <math>\mathcal{L}_\alpha</math> are identical to the truth conditions of the language <math>\mathcal{L}_{\textsf{EL}}</math>:

[Satisfaction Relation] Let <math>\mathcal{E}</math> be an event model, <math>e\in\mathcal{E}</math> and <math>\alpha\in\mathcal{L}_\alpha</math>. The satisfaction relation <math>\mathcal{E},e\models\alpha</math> is defined inductively as follows:

<math>\mathcal{E},e\models p_\psi</math> iff <math>I(e)=\psi</math>
<math>\mathcal{E},e\models\neg\alpha</math> iff it is not the case that <math>\mathcal{E},e\models\alpha</math>
<math>\mathcal{E},e\models\alpha\land\alpha'</math> iff <math>\mathcal{E},e\models\alpha</math> and <math>\mathcal{E},e\models\alpha'</math>
<math>\mathcal{E},e\models B_j\alpha</math> iff for all <math>f\in R_j(e)</math>, <math>\mathcal{E},f\models\alpha</math>

[Card Example] [ex] Let us resume Example [example9.1] and assume that players A and B show their cards to each other. As it turns out, C noticed that A showed her card to B but did not notice that B did so to A. Players A and B know this. This event is represented in the event model of Figure [fig2].

The boxed possible event <math>e</math> corresponds to the actual event 'players A and B show their red and green cards respectively to each other', <math>f</math> stands for the atomic event 'player A shows her red card' and <math>g</math> stands for the atomic event 'player A shows her green card'. The following statement holds in the example of Figure [fig2]:

[for1] <math>\mathcal{E},e\models (p_{\psi_e}\land B_A p_{\psi_e})\land(p_{\psi_e}\land B_B p_{\psi_e})\land(\langle B_C\rangle p_{\psi_f}\land\langle B_C\rangle p_{\psi_g}\land B_C(p_{\psi_f}\lor p_{\psi_g}))</math>

where <math>\psi_e</math>, <math>\psi_f</math> and <math>\psi_g</math> are the preconditions of the three possible events described above.

It states that players A and B show their cards to each other, that players A and B 'know' this, while player C considers it possible that player A shows her red card and also possible that player A shows her green card, since he does not know her card. In fact, that is all that player C considers possible, since he believes that player A shows either her red card or her green card. Another example of an event model is given in Figure [fig4]. This second example corresponds to the event whereby player A shows her card publicly to everybody.

The following statement holds in the example of Figure [fig4]:

<math>\mathcal{E},e\models B_A p_{\psi}\land B_B p_{\psi}\land B_C p_{\psi}\land B_AB_A p_{\psi}\land B_AB_B p_{\psi}\land B_AB_C p_{\psi}\land B_BB_A p_{\psi}\land B_BB_B p_{\psi}\land B_BB_C p_{\psi}\land B_CB_A p_{\psi}\land B_CB_B p_{\psi}\land B_CB_C p_{\psi}\land\ldots</math>

where <math>\psi</math> is the precondition of the actual event 'player A shows her red card'.

It states that player A shows her red card and that players A, B and C 'know' it, that players A, B and C 'know' that each of them 'knows' it, etc. In other words, there is common knowledge among players A, B and C that player A shows her red card.

Update of the Initial Situation by the Event: Product Update

The DEL product update of <math>(\mathcal{M},w)</math> and <math>(\mathcal{E},e)</math> is defined as follows. This update yields a new epistemic model <math>\mathcal{M}\otimes\mathcal{E}</math> representing how the new situation, which was previously represented by <math>(\mathcal{M},w)</math>, is perceived by the agents after the occurrence of the event represented by <math>(\mathcal{E},e)</math>.

[Product update] [def:6] Let <math>\mathcal{M}=(W,R_1,\ldots,R_n,V)</math> be an epistemic model and let <math>\mathcal{E}=(W_\alpha,R_1,\ldots,R_n,I)</math> be an event model. The product update of <math>\mathcal{M}</math> and <math>\mathcal{E}</math> is the epistemic model <math>\mathcal{M}\otimes\mathcal{E}=(W^\otimes,R^\otimes_1,\ldots,R^\otimes_n,V^\otimes)</math> defined as follows: for all <math>v\in W</math> and all <math>f\in W_\alpha</math>,

  • <math>W^\otimes=\{(v,f)\in W\times W_\alpha\mid\mathcal{M},v\models I(f)\}</math>,
  • <math>(v',f')\in R_j^\otimes(v,f)</math> iff <math>v'\in R_j(v)</math> and <math>f'\in R_j(f)</math>,
  • <math>V^\otimes(p)=\{(v,f)\in W^\otimes\mid v\in V(p)\}</math>.
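The three clauses of the product update translate almost line by line into code. Below is a hedged sketch, not the chapter's own implementation; the data encoding is invented, and the example event model describes a private announcement of <math>p</math> to agent a that agent b mistakes for nothing happening:

```python
def product_update(worlds, rels, val, events, erels, pre, agents):
    """DEL product update M (x) E, following the three clauses of the definition."""
    W = {(w, e) for w in worlds for e in events if pre[e](w)}
    R = {j: {((w, e), (v, f)) for (w, e) in W for (v, f) in W
             if (w, v) in rels[j] and (e, f) in erels[j]}
         for j in agents}
    V = {(w, e): val[w] for (w, e) in W}
    return W, R, V

# Initial model: a and b both ignorant whether p (world 1: p, world 2: not p).
worlds = {1, 2}
rels = {j: {(1, 1), (1, 2), (2, 1), (2, 2)} for j in 'ab'}
val = {1: {'p'}, 2: set()}
# Event model: e = "a privately learns p" (precondition p),
# f = "nothing happens" (precondition true); b confuses e with f.
events = {'e', 'f'}
erels = {'a': {('e', 'e'), ('f', 'f')},
         'b': {('e', 'f'), ('f', 'f')}}
pre = {'e': lambda w: 'p' in val[w], 'f': lambda w: True}

W, R, V = product_update(worlds, rels, val, events, erels, pre, 'ab')
print(sorted(W))  # [(1, 'e'), (1, 'f'), (2, 'f')]
# After the update, a's only successor of (1, 'e') is (1, 'e'): a knows p.
print({t for (s, t) in R['a'] if s == (1, 'e')})
# b still considers both (1, 'f') and (2, 'f') possible: b learned nothing.
print({t for (s, t) in R['b'] if s == (1, 'e')})
```

The actual world of the updated model is the pair <math>(w,e)</math>: here agent a has come to know <math>p</math> while agent b's uncertainty is untouched, which is exactly the kind of non-public event that PAL alone cannot express.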

As a result of the event described in Example [ex], the agents update their beliefs. We get the situation represented in the epistemic model of Figure [fig3].

In this epistemic model, we have for example the following statement: <math>(\mathcal{M},w)\otimes(\mathcal{E},e)\models (g_B\land B_A\,g_B)\land B_C\neg B_A\,g_B</math>, where the atom <math>g_B</math> stands for 'player B has the green card'. It states that player A 'knows' that player B has the green card, but player C believes that it is not the case.

=== A General Language ===

Because of their limited perception of the surrounding world, human and artificial agents often need to reason on partial and incomplete descriptions of events and situations. For any agent, the behavior of other agents is often partially or completely unknown: the other agents may simply be out of sight, for instance. For example, how can we be sure that an intruder does not know a certain piece of information after observing an exchange of messages in a group of agents, if all we know about him is that he was only able to read the messages limited to a certain vocabulary, or that he could only intercept and read the messages sent or received by a subgroup of agents? In general, we would be interested in expressing the following kind of formula: <math>[\alpha]\phi</math>, whose intuitive reading would be “<math>\phi</math> holds after the occurrence of any event such that all we know about it is that it satisfies <math>\alpha</math>”. This formula typically describes the occurring event partially and incompletely, although it could provide a full description of it as well.

[Language <math>\mathcal{L}_F</math>] [def updte] The language <math>\mathcal{L}_F</math> is defined inductively as follows: <math>\mathcal{L}_F:\phi~~::=~~ p~\mid~(\neg\phi)~\mid~(\phi\land\phi)~\mid~ B_j\phi~\mid~ [\alpha]\phi</math> where <math>p</math> ranges over <math>PROP</math>, <math>\alpha</math> ranges over the language describing events and <math>j</math> over <math>AGTS</math>. The formula <math>\langle\alpha\rangle\phi</math> is an abbreviation of the formula <math>\neg[\alpha]\neg\phi</math>. Let <math>(\mathcal{M},w)</math> be a pointed epistemic model. The truth conditions for the language <math>\mathcal{L}_F</math> are defined as in Definition [def:1.2.6], except for the operator <math>[\alpha]</math>:

:<math>(\mathcal{M},w)\models[\alpha]\phi</math> &nbsp;iff&nbsp; for all pointed event models <math>(\mathcal{E},e)</math> such that <math>(\mathcal{E},e)\models\alpha</math>, if <math>(\mathcal{M},w)\models Pre(e)</math> then <math>(\mathcal{M},w)\otimes(\mathcal{E},e)\models\phi</math>.
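The truth condition for <math>[\alpha]\phi</math> quantifies over ''all'' pointed event models satisfying <math>\alpha</math>, so it is not directly computable as stated. As a hedged illustration only, the sketch below approximates that quantification by an explicitly given finite family of candidate pointed event models; the callback <code>update</code> is assumed to implement the product update of Definition [def:6], and the satisfaction checks for <math>Pre(e)</math> and <math>\phi</math> are likewise supplied as callbacks (all names here are illustrative, not from the article):

```python
# Illustrative approximation (an assumption, not the article's algorithm):
# alpha's generally infinite class of pointed event models is replaced by
# the finite list `candidates`.
#   update(M, w, E, e) -> (M', w')   product update, per Definition [def:6]
#   pre_holds(M, w, E, e) -> bool    checks M,w |= Pre(e)
#   phi_holds(Mp, wp) -> bool        checks M',w' |= phi

def box_alpha(M, w, candidates, update, pre_holds, phi_holds):
    for E, e in candidates:
        if pre_holds(M, w, E, e):           # precondition applies at (M, w)
            Mp, wp = update(M, w, E, e)     # perform the product update
            if not phi_holds(Mp, wp):       # phi must hold after the event
                return False
    return True   # [alpha]phi is vacuously true if no candidate applies
```

The design mirrors the truth condition clause for clause: candidates whose precondition fails at <math>(\mathcal{M},w)</math> are skipped, and the formula is vacuously true when no candidate applies.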

A sound and complete proof system for <math>\textsf{L}_F</math> can be found in the literature. The following proposition shows that public announcement logic <math>\textsf{L}_{PAL}:=({\mathcal{L}_{PAL}},\mathcal{C}_{\textsf{ML}},\models)</math> is at least as expressive as our general logic <math>\textsf{L}_F</math>, i.e. <math>\textsf{L}_F\geq\textsf{L}_{PAL}</math> (see Definition [def:2.6.7]).

Let <math>\psi\in\mathcal{L}_{\textsf{EL}}</math>. Then, for all pointed epistemic models <math>(\mathcal{M},w)</math>,

  1. ^ A distinction is sometimes made between events and actions, an action being a specific type of event performed by an agent.
  2. ^ For example, assume that a university professor believes (is certain) that one of her colleague’s seminars is on Thursday (formally <math>Bq</math>, where <math>q</math> stands for “the seminar is on Thursday”). She is actually wrong because it is on Tuesday (<math>p</math>). Therefore, she does not know that her colleague’s seminar is on Tuesday (<math>\neg Kp</math>). If we assume that axiom '''5''' (<math>\neg K\phi\rightarrow K\neg K\phi</math>) is valid then we should conclude that she knows that she does not know that her colleague’s seminar is on Tuesday (<math>K\neg Kp</math>) (and therefore she also believes that she does not know it: <math>B\neg Kp</math>). This is obviously counterintuitive.
  3. ^ In both philosophy and computer science, there are formalizations of the internal point of view. Perhaps the dominant formalism for this is Moore’s auto-epistemic logic. In philosophy, there are models of full belief, which are also related to ideas in auto-epistemic logic. Elsewhere, I provide more details on the internal approach and its connection to the other modeling approaches, namely the imperfect and the perfect external approaches.