
Knowledge Representation

Chapter – 4
Compiled By: Bal Krishna Nyaupane
[email protected]
Introduction to Knowledge
 Knowledge is a theoretical or practical understanding of a subject that helps us to solve
problems in a particular domain, to predict what will happen next, and to explain why and how
something has happened.
 Knowledge is "information combined with experience, context, interpretation, and reflection.
It is a high-value form of information that is ready to apply to decisions and actions." (T.
Davenport et al., 1998)
 Knowledge is “human expertise stored in a person’s mind, gained through experience, and
interaction with the person’s environment." (Sunasee and Sewery, 2002)
 Knowledge is “information evaluated and organized by the human mind so that it can be
used purposefully, e.g., conclusions or explanations." (Rousa, 2002)
 In general, knowledge is more than just data; it consists of facts, ideas, beliefs, heuristics,
associations, rules, abstractions, relationships, and customs.

Introduction to Knowledge
 Research literature classifies knowledge as follows:
• Classification-based knowledge: the ability to classify information
• Decision-oriented knowledge: choosing the best option
• Descriptive knowledge: the state of some world (heuristic)
• Procedural knowledge: how to do something
• Reasoning knowledge: what conclusion is valid in what situation
• Assimilative knowledge: what its impact is
 Key issues of knowledge for the designer of an AI system are:
• Knowledge acquisition: gathering the knowledge from the problem domain needed to solve the
AI problem.
• Knowledge representation: expressing the identified knowledge in some knowledge
representation language such as propositional logic, predicate logic, etc.
• Knowledge manipulation: a large volume of knowledge has no meaning until it is processed to
deduce its hidden aspects. Knowledge is manipulated to draw conclusions from the
knowledge base.
Knowledge Representation using Logic
 Logic is defined as a formal language for expressing knowledge and ways of reasoning.
 A logic is defined by the following:
• Syntax - describes the possible configurations that constitute sentences.
• Semantics - determines what facts in the world the sentences refer to i.e. the interpretation.
Each sentence makes a claim about the world.
• Proof theory - set of rules for generating new sentences that are necessarily true given that the
old sentences are true. The relationship between sentences is called entailment. The semantics
link these sentences (representation) to facts of the world. The proof can be used to determine
new facts which follow from the old.
• A set of sentences : A sentence is constructed from a set of primitives according to syntax rules.
• A set of interpretations – An interpretation gives a semantic to primitives. It associates
primitives with values.
• The valuation (meaning) function – Assigns a value (typically the truth value) to a given
sentence under some interpretation. sentence × interpretation →{True , False }
 Types of logic: the main types of logic are propositional logic and first-order logic (first-order
predicate logic).
Propositional logic
 A proposition is a declarative sentence which is either true or false, but not both.
 Propositional logic is a mathematical model that allows us to reason about the truth or falsehood
of logical expression.
 Propositional Logic: Symbols (e.g., letters, words) are used to represent facts about the world
 Propositional symbols: P, Q, S, ... (atomic sentences)
 Wrapping parentheses: ( … )
 Sentences are combined by connectives:
∧ ... and [conjunction]
∨ ... or [disjunction]
→ ... implies [implication / conditional]
↔ ... is equivalent [bi-conditional]
¬ ... not [negation]
 Literal: an atomic sentence or a negated atomic sentence
 Every sentence constructed with binary connectives must be enclosed in parentheses.
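In code, these connectives can be modeled directly as Boolean functions; the following minimal Python sketch (the function names are my own, for illustration) evaluates a sentence under one interpretation:

```python
# Truth-functional semantics of the propositional connectives.
def NOT(p):        return not p
def AND(p, q):     return p and q
def OR(p, q):      return p or q
def IMPLIES(p, q): return (not p) or q   # p -> q is false only when p is true and q is false
def IFF(p, q):     return p == q         # biconditional: true when both sides agree

# Example: evaluate (P AND Q) -> R under one interpretation.
P, Q, R = True, True, False
print(IMPLIES(AND(P, Q), R))  # False
```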
Propositional logic
 Examples of declarative sentences which are propositions:
1. Kathmandu is the capital of Nepal. (true proposition)
2. The sun rises in the west. (false proposition)
3. 5 + 1 = 6. (true proposition)
4. Who are you? (not a proposition)
 A sentence is defined as follows:
• A symbol is a sentence
• If S is a sentence, then ¬S is a sentence
• If S is a sentence, then (S) is a sentence
• If S and T are sentences, then (S ∧ T), (S ∨ T), (S → T), and (S ↔ T) are sentences
• A sentence results from a finite number of applications of the above rules
 The order of precedence in propositional logic is: ¬, ∧, ∨, →, ↔

Propositional logic
 Atomic sentences: a single symbol standing for a proposition that can be true or false.
P means "It is hot." Q means "It is humid." R means "It is raining."
 Complex sentences: constructed from simple sentences using logical connectives.
• (P ∧ Q) → R ; "If it is hot and humid, then it is raining"
 A valid sentence or tautology is a sentence that is true under all interpretations, no matter what
the world is actually like or how the semantics are defined.
 An inconsistent sentence or contradiction is a sentence that is false under all interpretations.
The world is never like what it describes, as in "It's raining and it's not raining."
 ¬P is called the negation of P.
 P ∧ Q is called the conjunction of P and Q.
 P ∨ Q is called the disjunction of P and Q.
 P → Q is called a conditional or implication. P is referred to as the antecedent; Q as the
consequent.
 P ↔ Q is called a biconditional or equivalence.
Truth Tables in PL

• Exercise: show that a given sentence is a tautology.
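Whether a sentence is a tautology can be checked mechanically by enumerating every interpretation in its truth table. A minimal Python sketch (the helper name is my own), shown here on the modus ponens tautology [p ∧ (p → q)] → q:

```python
from itertools import product

def is_tautology(formula, num_vars):
    """True iff the Boolean function is true under all 2**num_vars interpretations."""
    return all(formula(*values) for values in product([False, True], repeat=num_vars))

# [p AND (p -> q)] -> q  (the tautology behind modus ponens)
mp = lambda p, q: (not (p and ((not p) or q))) or q
print(is_tautology(mp, 2))  # True
```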
Equivalence Laws in PL
Inference Rules of PL
 Each rule of inference corresponds to a tautology:
• Modus ponens: from p and p → q, infer q. Tautology: [p ∧ (p → q)] → q
• Modus tollens: from ¬q and p → q, infer ¬p. Tautology: [¬q ∧ (p → q)] → ¬p
• Hypothetical syllogism: from p → q and q → r, infer p → r.
Tautology: [(p → q) ∧ (q → r)] → (p → r)
• Disjunctive syllogism: from p ∨ q and ¬p, infer q. Tautology: [(p ∨ q) ∧ ¬p] → q
• Addition: from p, infer p ∨ q. Tautology: p → (p ∨ q)
• Simplification: from p ∧ q, infer p. Tautology: (p ∧ q) → p
• Conjunction: from p and q, infer p ∧ q. Tautology: ((p) ∧ (q)) → (p ∧ q)
• Resolution: from p ∨ q and ¬p ∨ r, infer q ∨ r. Tautology: [(p ∨ q) ∧ (¬p ∨ r)] → (q ∨ r)
 Example 1: Ram is an OOP major. Therefore, Ram is either an OOP major or a DSA major.
P: Ram is an OOP major. Q: Ram is a DSA major.
P
------- addition
P ∨ Q
 Example 2: Hari is a computer science major and a mathematics major. Therefore, Hari is a mathematics major.
P: Hari is a mathematics major. Q: Hari is a computer science major.
P ∧ Q
------- simplification
P
 Example 3: If it snows today, we will go snow tubing. We didn't go snow tubing. Therefore, it
didn't snow today.
P: It snows today. Q: We will go snow tubing.
P → Q
¬Q
------- modus tollens
¬P

 Example 4: Show that the hypotheses "Ram works hard," "If Ram works hard, then he is a dull
boy," and "If Ram is a dull boy, then he will not get the job" imply the conclusion "Ram will not
get the job."
P: Ram works hard. Q: Ram is a dull boy. R: Ram will not get the job.
1. P
2. P → Q
3. Q → R
---------
4. Q from 1 and 2 by MP
5. R from 3 and 4 by MP
 Example 5: Use resolution to show that the hypotheses "Ram is skiing or it is not snowing" and
"It is snowing or Hari is playing hockey" imply that "Ram is skiing or Hari is playing hockey".
Q: Ram is skiing. P: It is snowing. R: Hari is playing hockey.
Q ∨ ¬P
P ∨ R
------------------- resolution
Q ∨ R
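Each of the rules used in these examples is backed by a tautology, so all of them can be verified by brute-force truth-table enumeration; a small sketch (helper names are my own):

```python
from itertools import product

def taut(f, n):
    """True iff f is true under every assignment to its n variables."""
    return all(f(*v) for v in product([False, True], repeat=n))

imp = lambda a, b: (not a) or b  # material implication

# One tautology per inference rule used above.
assert taut(lambda p, q: imp(p and imp(p, q), q), 2)                      # modus ponens
assert taut(lambda p, q: imp((not q) and imp(p, q), not p), 2)            # modus tollens
assert taut(lambda p, q, r: imp(imp(p, q) and imp(q, r), imp(p, r)), 3)   # hypothetical syllogism
assert taut(lambda p, q: imp((p or q) and (not p), q), 2)                 # disjunctive syllogism
assert taut(lambda p, q, r: imp((p or q) and ((not p) or r), q or r), 3)  # resolution
print("all rules verified")
```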

BNF Grammar
 Backus Normal (or Naur) Form is a notation technique for context-free grammars, often used to
describe the syntax of languages used in computing.
 A BNF grammar can be used in two ways:
 To generate strings belonging to the grammar
 To recognize strings belonging to the grammar
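As an illustration of recognizing strings against a BNF grammar, here is a minimal recursive-descent sketch for the toy grammar <S> ::= "a" | "(" <S> ")" (the grammar and the function name are my own, not from the slides):

```python
def accepts(s):
    """Recognize the toy BNF grammar  <S> ::= "a" | "(" <S> ")"  ."""
    pos = 0

    def S():
        nonlocal pos
        if pos < len(s) and s[pos] == 'a':      # production  <S> ::= "a"
            pos += 1
            return True
        if pos < len(s) and s[pos] == '(':      # production  <S> ::= "(" <S> ")"
            pos += 1
            if S() and pos < len(s) and s[pos] == ')':
                pos += 1
                return True
        return False

    ok = S()
    return ok and pos == len(s)  # the whole string must be consumed

print(accepts("((a))"))  # True
print(accepts("(a"))     # False
```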

First Order predicate Logic (FOPL)
 Propositional logic is a weak language
• Hard to identify “individuals” (e.g., Mary, 3)
• Can’t directly talk about properties of individuals or relations between individuals (e.g., “Bill
is tall”)
• Generalizations, patterns, regularities can’t easily be represented (e.g., “all triangles have 3
sides”)
• FOPL is expressive enough to concisely represent this kind of information
 Predicate logic is an extension of propositional logic with more expressive power.
 In propositional logic, each possible atomic fact requires a separate unique propositional
symbol.
 If there are n people and m locations, representing the fact that some person moved from one
location to another requires n × m² separate symbols.
 Predicate logic includes a richer ontology: objects (terms), properties (unary predicates on
terms), relations (n-ary predicates on terms), and functions (mappings from terms to other terms).
FOPL( First Order Predicate Logic)
 FOPC (First Order Predicate Calculus) is a logic that gives us the ability to quantify over objects.
 In FOPL, statements from a natural language like English are translated into a symbolic structure
composed of predicates, functions, variables, constants, quantifiers and logical connectives.
 FOPL represents facts by separating classes and individuals, and considers that the world consists
of objects and relations.
 FOPL Elements
• Logical connectives: ¬, ∧, ∨, →, ↔, ()
• Constant symbols: strings that will be interpreted as representing objects, e.g. Ram, Car.
• Variable symbols: used as placeholders for quantifying over objects, e.g. x, y, a, b.
• Predicate symbols: used to denote properties of objects and relationships among them, e.g.
Brother, Owns, > etc.
• Function symbols: map the specified number of input objects to objects, e.g. father of, best
friend, third quarter of, one more than, beginning of, ...
Sentences in FOPL
 Atomic Sentence is simply a Predicate applied to a set of objects (no variable).
• Example1:Brother (Mahesh, Dinesh)
• Example2: Owns(Ram ,Car)
• Example3: Sold (Ram, car, Hari)
 Complex sentences are made from atomic sentences using connectives: ¬S, S1 ∧ S2, S1 ∨ S2,
S1 → S2, S1 ↔ S2
• Example1: Owns ( Ram ,car ) ⋁ owns (Hari ,car )
• Example2: Sold (Ram, car, Hari) => owns (Hari ,car)

FOPL Elements
 Quantifiers
• Each quantifier defines a variable for the duration of the following expression, and indicates
the truth of the expression.
 Universal quantifier "∀"
• It means "for all"
• The expression is true for every possible value of the variable.
• Syntax: ∀ <variables> <sentence>
• ∀x P is true in a model m iff P is true with x being each possible object in the model.
• Typically, "→" is the main connective with ∀.
Example1: Everyone at HCOE is smart: ∀x At(x, HCOE) → Smart(x)
Example2: Everyone likes McDonalds: ∀x likes(x, McDonalds)
Example3: All children like McDonalds: ∀x child(x) → likes(x, McDonalds)
• Common mistake: using "∧" as the main connective with ∀.
Example: ∀x At(x, HCOE) ∧ Smart(x) means "Everyone is at HCOE and everyone is
smart"

FOPL Elements
 Existential quantifier "∃"
• It means "there exists"
• The expression is true for at least one value of the variable.
• Syntax: ∃ <variables> <sentence>
• ∃x P is true in a model m iff P is true with x being at least one possible object in the model.
• Typically, "∧" is the main connective with ∃.
Example1: Someone at HCOE is smart: ∃x At(x, HCOE) ∧ Smart(x).
Example2: Someone likes McDonalds: ∃x likes(x, McDonalds).
• Common mistake: using "→" as the main connective with ∃.
Example: ∃x At(x, HCOE) → Smart(x) is true if there is anyone who is not at HCOE.

Quantifier Scope
 FOL sentences have structure, like programs. In particular, the variables in a sentence have a
scope.
 For example, suppose we want to say "everyone who is alive loves someone":
• (∀x) alive(x) → (∃y) loves(x, y)
 Switching the order of universal quantifiers does not change the meaning:
• (∀x)(∀y)P(x,y) ↔ (∀y)(∀x)P(x,y)
 Switching the order of existential quantifiers does not change the meaning:
• (∃x)(∃y)P(x,y) ↔ (∃y)(∃x)P(x,y)
 Switching the order of universals and existentials does change the meaning:
• Everyone likes someone: (∀x)(∃y) likes(x,y)
• Someone is liked by everyone: (∃y)(∀x) likes(x,y)

Connections between All and Exists
 We can relate sentences involving ∀ and ∃ using De Morgan's laws:
1. ¬(∀x) P(x) ↔ (∃x) ¬P(x)
2. ¬(∃x) P(x) ↔ (∀x) ¬P(x)
3. (∀x) P(x) ↔ ¬(∃x) ¬P(x)
4. (∃x) P(x) ↔ ¬(∀x) ¬P(x)
 Examples
1. All dogs don't like cats ↔ No dogs like cats
• ∀x Dog(x) → ¬Likes(x, Cat)
• ¬(∃x Dog(x) ∧ Likes(x, Cat))
2. Not all dogs dance ↔ There is a dog that doesn't dance
• ¬(∀x Dog(x) → Dance(x))
• ∃x Dog(x) ∧ ¬Dance(x)
3. All dogs sleep ↔ There is no dog that doesn't sleep
• ∀x Dog(x) → Sleep(x)
• ¬∃x Dog(x) ∧ ¬Sleep(x)
4. There is a dog that talks ↔ Not all dogs can't talk
• ∃x Dog(x) ∧ Talk(x)
• ¬∀x Dog(x) → ¬Talk(x)
Example FOPL
1. Obama is not short: ¬short(Obama)
2. All children like McDonalds: ∀x child(x) → likes(x, McDonalds)
3. Every gardener likes the sun: ∀x gardener(x) → likes(x, Sun)
4. You can fool some of the people all of the time: ∃x ∀t person(x) ∧ time(t) → can-fool(x, t)
5. You can fool all of the people some of the time: ∀x (person(x) → ∃t (time(t) ∧ can-fool(x, t)))
6. All purple mushrooms are poisonous: ∀x (mushroom(x) ∧ purple(x)) → poisonous(x)
7. Cats rule and dogs drool: ∀a (Cat(a) ⇒ Rules(a)) ∧ (Dog(a) ⇒ Drools(a))
8. No employee can earn more than the chairman: ∀x Employee(x) ⇒ ¬EarnMore(x, Chairman)
9. Every student who is taking AI is cool: ∀x Student(x) ∧ Taking_AI(x) ⇒ Cool(x)
10. Everyone who eats apples is either homeless or a graduate student:
 ∀x Eats_apple(x) ⇒ Homeless(x) ∨ Grad_Student(x)
11. No purple mushroom is poisonous (two ways):
 ¬∃x purple(x) ∧ mushroom(x) ∧ poisonous(x)
 ∀x (mushroom(x) ∧ purple(x)) → ¬poisonous(x)
Example FOPL
12. Everyone likes McDonalds unless they are allergic to it:
 ∀x ¬allergic(x, McDonalds) → likes(x, McDonalds)
13. There is a person who loves everyone in the world: ∃x ∀y Loves(x, y)
14. Everyone in the world is loved by at least one person: ∀y ∃x Loves(x, y)
15. No one likes McDonalds: ¬(∃x likes(x, McDonalds)) OR ∀x ¬likes(x, McDonalds)
16. Not everyone likes McDonalds: ¬(∀x likes(x, McDonalds)) OR ∃x ¬likes(x, McDonalds)
17. Someone at Stanford is smart: ∃x At(x, Stanford) ∧ Smart(x)
18. Some employees are sick today: ∃x Employee(x) ∧ Sick(x)
19. All employees earning Rs 300000 or more per year pay taxes:
 ∀x Employee(x) ∧ Earn_More(x, 300000) ⇒ Pay_Tax(x)
20. Ram eats everything Sita eats: ∀x Eats(Sita, x) → Eats(Ram, x)
21. There are no mushrooms that are poisonous and purple:
 ∀x Mushroom(x) ⇒ ¬(Poisonous(x) ∧ Purple(x))
22. There is a mushroom that is purple and poisonous: ∃x Mushroom(x) ∧ Poisonous(x) ∧
Purple(x)
Example FOPL
23. Everyone likes some kind of food: ∀y ∃x food(x) ∧ likes(y, x)
24. There is a kind of food that everyone likes: ∃x ∀y food(x) ∧ likes(y, x)
25. Someone likes all kinds of food: ∃y ∀x food(x) → likes(y, x)
26. Every food has someone who likes it: ∀x ∃y food(x) → likes(y, x)
27. Shyam hates all people who don't hate themselves:
 ∀x Person(x) ∧ ¬Hates(x, x) → Hates(Shyam, x)
28. Everyone is married to exactly one person:
 ∀x ∃y Married(x, y) ∧ ∀z (Married(x, z) → (y = z))
29. Someone who hates something owned by another person will not love that person:
 ∀x ∀y ∃z Owns(y, z) ∧ Hates(x, z) → ¬Loves(x, y)
30. There is a barber in town who shaves all men in town who don't shave themselves:
 ∃x Barber(x) ∧ Intown(x) ∧ ∀y (Men(y) ∧ Intown(y) ∧ ¬Shaves(y, y) → Shaves(x, y))
31. There are exactly two purple mushrooms:
 ∃x ∃y mushroom(x) ∧ purple(x) ∧ mushroom(y) ∧ purple(y) ∧ ¬(x = y) ∧
 ∀z ((mushroom(z) ∧ purple(z)) → ((x = z) ∨ (y = z)))
Inference rule in FOPL
 Universal Instantiation
• Also known as universal elimination
• For any sentence α, variable v, and ground term g: from (∀v) α, infer Subst({v/g}, α)
• Notation: Subst({v/g}, α) means the result of substituting g for v in sentence α
• If (∀x) P(x) is true, then P(C) is true, where C is any constant in the domain of x.
• Example: If a person is an employee, works in Hillside and teaches AI, then he/she is a
lecturer.
• In FOPL: ∀x employee(x) ∧ Works(x, Hillside) ∧ Teaches(x, AI) → Lecturer(x).
• We can substitute x/Ram and infer:
• employee(Ram) ∧ Works(Ram, Hillside) ∧ Teaches(Ram, AI) → Lecturer(Ram)

Inference rule in FOPL
 Existential Instantiation
• Also known as existential elimination
• For any sentence α, variable v, and constant symbol k that does not appear elsewhere in the
knowledge base: from (∃v) α, infer Subst({v/k}, α)
• From (∃x) P(x), infer P(c).
• Example: from (∃x) eats(Ram, x) we can substitute x/Apple and infer eats(Ram, Apple), as
long as Apple does not appear elsewhere in the knowledge base.
• All instances of the existential variable are replaced by the new constant symbol.
• Note that the constant symbol cannot already exist anywhere else in the knowledge base.

Inference rule in FOPL
 Existential Introduction
• For any sentence α, variable v that does not occur in α, and ground term g that occurs in α:
from α, infer (∃v) Subst({g/v}, α)
• If P(c) is true, then (∃x) P(x) can be inferred.
• Example: from eats(Ram, Ice-cream) we can infer (∃x) eats(Ram, x)
• Instances of the given ground term are replaced by the new variable symbol.
• Note that the variable symbol cannot already exist anywhere in the expression.
 Propositionalization
• All people are honest: ∀x people(x) → Honest(x)
people(Hari) → Honest(Hari)
 We can conclude: "Hari is honest"
• Someone at Stanford is smart: ∃x At(x, Stanford) ∧ Smart(x)
 At(Hari, Stanford) ∧ Smart(Hari)
Inference rule in FOPL
 Generalized Modus Ponens (GMP)
• Applies modus ponens reasoning to generalized rules.
• Combines And-Introduction, Universal Elimination, and Modus Ponens.
• E.g., from P(c), Q(c), and (∀x)(P(x) ∧ Q(x)) ⇒ R(x), derive R(c).
• Example: ∀x Student(x) ∧ Studied_Hard(x) → Topper(x)
• Given: Student(Sunil), Studied_Hard(Sunil)
• Conclusion: Topper(Sunil)
 Unification
 It is a "pattern-matching" procedure.
• Takes two atomic sentences, called literals, as input.
• Returns "Failure" if they do not match, and a substitution list if they do match.
 That is, unify(p, q) = θ means subst(θ, p) = subst(θ, q) for two atomic sentences p and q.
 All variables in the given two literals are implicitly universally quantified.
 To make literals match, replace (universally quantified) variables by terms.
Inference rule in FOPL
 Example: unify parents(x, father(x), mother(Ram)) with parents(Ram, father(Ram), y):
{x/Ram, y/mother(Ram)} yields parents(Ram, father(Ram), mother(Ram))
 Example: unify parents(x, father(x), mother(Ram)) with parents(Ram, father(y), z):
{x/Ram, y/Ram, z/mother(Ram)} yields parents(Ram, father(Ram), mother(Ram))
 Example: unify parents(x, father(x), mother(Hari)) with parents(Ram, father(y), mother(y)):
Failure (y would have to be both Ram and Hari)
 Example: unify knows(Sita, x) with knows(Sita, Rita): x = Rita
 Example: unify knows(Sita, x) with knows(y, Rita): x = Rita and y = Sita
 Example: unify knows(Sita, x) with knows(y, Is_mother(y)): y = Sita and x = Is_mother(Sita)
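The examples above can be reproduced with a standard unification routine. A compact sketch (the term encoding is my own: compound terms are tuples like ('father', 'x'), lowercase strings are variables, capitalized strings are constants; the occurs check is omitted for brevity):

```python
def is_var(t):
    return isinstance(t, str) and t.islower()   # convention: lowercase atoms are variables

def substitute(t, theta):
    """Apply substitution theta to term t, following variable chains."""
    if is_var(t):
        return substitute(theta[t], theta) if t in theta else t
    if isinstance(t, tuple):                    # compound term: (functor, arg1, ...)
        return (t[0],) + tuple(substitute(a, theta) for a in t[1:])
    return t

def unify(x, y, theta=None):
    """Return a substitution dict unifying x and y, or None on failure."""
    if theta is None:
        theta = {}
    x, y = substitute(x, theta), substitute(y, theta)
    if x == y:
        return theta
    if is_var(x):
        theta[x] = y
        return theta
    if is_var(y):
        theta[y] = x
        return theta
    if isinstance(x, tuple) and isinstance(y, tuple) and len(x) == len(y) and x[0] == y[0]:
        for a, b in zip(x[1:], y[1:]):          # unify argument lists left to right
            theta = unify(a, b, theta)
            if theta is None:
                return None
        return theta
    return None                                  # mismatched constants or functors

# parents(x, father(x), mother(Ram))  vs  parents(Ram, father(y), z)
p1 = ('parents', 'x', ('father', 'x'), ('mother', 'Ram'))
p2 = ('parents', 'Ram', ('father', 'y'), 'z')
print(unify(p1, p2))  # {'x': 'Ram', 'y': 'Ram', 'z': ('mother', 'Ram')}
```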
Resolution
 Resolution is a sound and complete inference procedure for FOL.
 It is based on proof by contradiction and is usually called resolution refutation.
 The general technique is to add the negation of the sentence to be proven to the KB and see if
this leads to a contradiction.
 Resolution can be applied to sentences in CNF (Conjunctive Normal Form).
 Process of resolution:
1) Convert all sentences to CNF
2) Negate the conclusion S and convert the result to CNF
3) Add the negated conclusion ¬S to the premise clauses
4) Repeat until contradiction or no progress is made:
1) Select 2 clauses (call them parent clauses)
2) Resolve them together, performing all required unifications
3) If the resolvent is the empty clause, a contradiction has been found (i.e., S follows
from the premises)
4) If not, add the resolvent to the premises
 If we succeed in Step 4, we have proved the conclusion.

Converting sentences to CNF
1. Eliminate all ↔ connectives using the equivalence
(P ↔ Q) ≡ ((P → Q) ∧ (Q → P))
2. Eliminate all → connectives using the equivalence
(P → Q) ≡ (¬P ∨ Q)
3. Move ¬ inwards (De Morgan's laws)
¬¬P ≡ P
¬(P ∧ Q) ≡ ¬P ∨ ¬Q
¬(P ∨ Q) ≡ ¬P ∧ ¬Q
¬(∀x)P ≡ (∃x)¬P
¬(∃x)P ≡ (∀x)¬P
4. Standardize variables: rename all variables so that each quantifier has its own unique variable
name

Converting sentences to CNF
5. Skolemization: the process of eliminating the existential quantifier through a substitution
process.
(∃x)P(x) becomes P(c)
c is a Skolem constant (a brand-new constant symbol that is not used in any other sentence)
(∀x)(∃y)P(x,y) becomes (∀x)P(x, F(x))
Since ∃ is within the scope of a universally quantified variable, use a Skolem function F to
construct a new value that depends on the universally quantified variable.
F must be a brand-new function name not occurring in any other sentence in the KB.
E.g., (∀x)(∃y)loves(x,y) becomes (∀x)loves(x, F(x))
In this case, F(x) specifies the person that x loves.

Converting sentences to CNF
6. Drop universal quantifiers by
1) moving them all to the left end;
2) making the scope of each the entire sentence; and
3) dropping the "prefix" part
Ex: (∀x)P(x) becomes P(x)
7. Distribute ∨ over ∧
(P ∧ Q) ∨ R ≡ (P ∨ R) ∧ (Q ∨ R)
(P ∨ Q) ∨ R ≡ (P ∨ Q ∨ R)
8. Split conjuncts into separate clauses
9. Standardize variables so each clause contains only variable names that do not occur in any
other clause
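For the propositional fragment, steps 1-3 and 7 can be coded directly as recursive rewrites. A sketch using nested tuples as the formula representation (the representation and function names are my own):

```python
# Formulas: ('var', name), ('not', f), ('and', f, g), ('or', f, g),
#           ('imp', f, g), ('iff', f, g)

def eliminate(f):
    """Steps 1-2: rewrite <-> and -> in terms of and/or/not."""
    op = f[0]
    if op == 'var':
        return f
    if op == 'not':
        return ('not', eliminate(f[1]))
    a, b = eliminate(f[1]), eliminate(f[2])
    if op == 'imp':
        return ('or', ('not', a), b)
    if op == 'iff':
        return ('and', ('or', ('not', a), b), ('or', ('not', b), a))
    return (op, a, b)

def push_not(f, negate=False):
    """Step 3: move negations inwards with De Morgan's laws."""
    op = f[0]
    if op == 'var':
        return ('not', f) if negate else f
    if op == 'not':
        return push_not(f[1], not negate)     # also removes double negation
    a, b = f[1], f[2]
    if negate:
        op = 'and' if op == 'or' else 'or'    # De Morgan: flip the connective
        return (op, push_not(a, True), push_not(b, True))
    return (op, push_not(a), push_not(b))

def distribute(f):
    """Step 7: distribute or over and."""
    op = f[0]
    if op in ('var', 'not'):
        return f
    a, b = distribute(f[1]), distribute(f[2])
    if op == 'or':
        if a[0] == 'and':
            return distribute(('and', ('or', a[1], b), ('or', a[2], b)))
        if b[0] == 'and':
            return distribute(('and', ('or', a, b[1]), ('or', a, b[2])))
    return (op, a, b)

def to_cnf(f):
    return distribute(push_not(eliminate(f)))

# not (P -> Q)  becomes  P and not Q
P, Q = ('var', 'P'), ('var', 'Q')
print(to_cnf(('not', ('imp', P, Q))))
```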

Example 1: CNF Conversion
(∀x)(P(x) → ((∀y)(P(y) → P(F(x,y))) ∧ ¬(∀y)(Q(x,y) → P(y))))
2. Eliminate →
(∀x)(¬P(x) ∨ ((∀y)(¬P(y) ∨ P(F(x,y))) ∧ ¬(∀y)(¬Q(x,y) ∨ P(y))))
3. Reduce scope of negation
(∀x)(¬P(x) ∨ ((∀y)(¬P(y) ∨ P(F(x,y))) ∧ (∃y)(Q(x,y) ∧ ¬P(y))))
4. Standardize variables
(∀x)(¬P(x) ∨ ((∀y)(¬P(y) ∨ P(F(x,y))) ∧ (∃z)(Q(x,z) ∧ ¬P(z))))
5. Eliminate existential quantification
(∀x)(¬P(x) ∨ ((∀y)(¬P(y) ∨ P(F(x,y))) ∧ (Q(x,G(x)) ∧ ¬P(G(x)))))
6. Drop universal quantification symbols
(¬P(x) ∨ ((¬P(y) ∨ P(F(x,y))) ∧ (Q(x,G(x)) ∧ ¬P(G(x)))))

Example 1: CNF Conversion
7. Convert to conjunction of disjunctions
(¬P(x) ∨ ¬P(y) ∨ P(F(x,y))) ∧ (¬P(x) ∨ Q(x,G(x))) ∧
(¬P(x) ∨ ¬P(G(x)))
8. Create separate clauses
¬P(x) ∨ ¬P(y) ∨ P(F(x,y))
¬P(x) ∨ Q(x,G(x))
¬P(x) ∨ ¬P(G(x))
9. Standardize variables
¬P(x) ∨ ¬P(y) ∨ P(F(x,y))
¬P(z) ∨ Q(z,G(z))
¬P(w) ∨ ¬P(G(w))
Note: Now that the quantifiers are gone, we do need the upper/lower-case distinction.
Example 2: CNF Conversion
Resolution Example
 All people who are graduating are happy. All happy people smile. Kiran is graduating. Is Kiran
smiling?
 First convert to predicate logic:
• ∀x graduating(x) → happy(x)
• ∀x happy(x) → smiling(x)
• graduating(Kiran)
• Goal: smiling(Kiran)
 Negate the goal: ¬smiling(Kiran)
 Convert to CNF:
1. Eliminate →
• ∀x ¬graduating(x) ∨ happy(x)
• ∀x ¬happy(x) ∨ smiling(x)
• graduating(Kiran)
• ¬smiling(Kiran)
2. Move ¬ inwards (not needed)
3. Standardize variables apart:
• ∀x ¬graduating(x) ∨ happy(x)
• ∀y ¬happy(y) ∨ smiling(y)
• graduating(Kiran)
• ¬smiling(Kiran)
4. Skolemize (not needed)
5. Drop all ∀:
1. ¬graduating(x) ∨ happy(x)
2. ¬happy(y) ∨ smiling(y)
3. graduating(Kiran)
4. ¬smiling(Kiran)
6. Distribute ∨ over ∧ (not needed)
7. Make each conjunct a separate clause (not needed)
8. Standardize the variables apart again (not needed)
 Ready for resolution!
• Resolve 4 and 2 with {y/Kiran}: we can conclude
• 5. ¬happy(Kiran)
• Resolve 5 and 1 with {x/Kiran}: we can conclude
• 6. ¬graduating(Kiran)
• Resolve 6 and 3:
• ⊥ (the empty clause), so smiling(Kiran) follows.
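Because every clause in this example becomes ground once Kiran is substituted for the variables, the refutation can be replayed with a small propositional resolution procedure. A sketch (the encoding of clauses as frozensets of (symbol, polarity) literals is my own):

```python
# Clauses are frozensets of literals; a literal is (name, polarity).
def resolve(c1, c2):
    """Return all resolvents of two clauses."""
    out = []
    for (name, pol) in c1:
        if (name, not pol) in c2:           # complementary pair found
            out.append((c1 - {(name, pol)}) | (c2 - {(name, not pol)}))
    return out

def refutes(clauses):
    """Saturate by resolution; True iff the empty clause is derived."""
    clauses = set(clauses)
    while True:
        new = set()
        for a in clauses:
            for b in clauses:
                if a == b:
                    continue
                for r in resolve(a, b):
                    if not r:               # empty clause: contradiction found
                        return True
                    new.add(frozenset(r))
        if new <= clauses:                  # no progress: not refutable
            return False
        clauses |= new

G = ('graduating', True); nG = ('graduating', False)
H = ('happy', True);      nH = ('happy', False)
S = ('smiling', True);    nS = ('smiling', False)

kb = [frozenset({nG, H}),   # graduating -> happy
      frozenset({nH, S}),   # happy -> smiling
      frozenset({G}),       # graduating(Kiran)
      frozenset({nS})]      # negated goal: not smiling(Kiran)
print(refutes(kb))  # True, so smiling(Kiran) follows
```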
Resolution Example-2
 Shyam owns a dog. Every dog owner is an animal lover. No animal lover kills an animal. Either
Shyam or Kamal killed the cat, who is named Tuna. Did Kamal kill the cat?
 FOPL represented as follows:
A. (∃x) (Dog(x) ∧ Owns(Shyam, x))
B. (∀x) (((∃y) (Dog(y) ∧ Owns(x, y))) → AnimalLover(x))
C. (∀x) (AnimalLover(x) → ((∀y) Animal(y) → ¬Kills(x, y)))
D. Kills(Shyam, Tuna) ∨ Kills(Kamal, Tuna)
E. Cat(Tuna)
F. (∀x) (Cat(x) → Animal(x))
G. Kills(Kamal, Tuna) GOAL

Resolution Example-2
 Convert to clause form
A1. (Dog(D))
A2. (Owns(Shyam, D)) D is a Skolem constant
B. (¬Dog(y), ¬Owns(x, y), AnimalLover(x))
C. (¬AnimalLover(a), ¬Animal(b), ¬Kills(a, b))
D. (Kills(Shyam, Tuna), Kills(Kamal, Tuna))
E. Cat(Tuna)
F. (¬Cat(z), Animal(z))
 Add the negation of the query:
¬G: (¬Kills(Kamal, Tuna))

Resolution Example-2
 The resolution refutation proof
• R1: ¬G, D, {} (Kills(Shyam, Tuna))
• R2: R1, C, {a/Shyam, b/Tuna} (¬AnimalLover(Shyam), ¬Animal(Tuna))
• R3: R2, B, {x/Shyam} (¬Dog(y), ¬Owns(Shyam, y), ¬Animal(Tuna))
• R4: R3, A1, {y/D} (¬Owns(Shyam, D), ¬Animal(Tuna))
• R5: R4, A2, {} (¬Animal(Tuna))
• R6: R5, F, {z/Tuna} (¬Cat(Tuna))
• R7: R6, E, {} FALSE (the empty clause)

Resolution Example-2: The proof tree
Resolution Example-3
Well-formed formula

 A well-formed formula (wff) in propositional logic is defined as follows:
• An atom is a well-formed formula.
• If α is a well-formed formula, then ¬α is a well-formed formula.
• If α and β are well-formed formulae, then (α ∧ β), (α ∨ β), (α → β),
(α ↔ β) are also well-formed formulae.
• A propositional expression is a well-formed formula if and only if it can
be obtained by using the above conditions.

Well-formed formula
 Well-Formed Formula for First Order Predicate Logic
• Expressions which produce a proposition when their symbols are interpreted must follow the
rules given below; they are called wffs of first order predicate logic.
• A predicate name followed by a list of variables, such as P(x, y), where P is a predicate name
and x and y are variables, is called an atomic formula.
• Wffs are constructed using the following rules:
• True and False are wffs.
• Each propositional constant (i.e. a specific proposition) and each propositional variable (i.e. a
variable representing propositions) is a wff.
• Each atomic formula (i.e. a specific predicate with variables) is a wff.
• If A and B are wffs, then so are ¬A, (A ∧ B), (A ∨ B), (A → B), and (A ↔ B).
• If x is a variable (representing objects of the universe of discourse) and A is a wff, then so
are ∀x A and ∃x A.
Horn clauses
 A Horn clause is a clause (a disjunction of literals) with at most one positive, i.e. unnegated,
literal.
 A Horn clause with exactly one positive literal is a definite clause; a definite clause with no
negative literals is sometimes called a fact; and a Horn clause without a positive literal is
sometimes called a goal clause.
 The three kinds of Horn clauses are illustrated by the following propositional example:
• Definite clause: ¬p ∨ ¬q ∨ r (equivalently, p ∧ q → r)
• Fact: r
• Goal clause: ¬p ∨ ¬q
Horn Clauses
 Reasons for being important:
 Every Horn clause can be written as an implication whose premise is a conjunction of
positive literals and whose conclusion is a single positive literal.
E.g. ¬P1 ∨ ¬P2 ∨ R can be written as P1 ∧ P2 ⇒ R
 Inference with Horn clauses can be done with forward and backward chaining.
 Deciding entailment with Horn clauses can be done in time that is linear in the size of the
knowledge base.
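The entailment procedure for propositional Horn clauses can be sketched as forward chaining to a fixpoint (a simple quadratic version rather than the optimized linear-time algorithm; the rule encoding is my own):

```python
def horn_entails(rules, facts, query):
    """Forward chaining over Horn rules.

    rules: list of (premises, conclusion) pairs, e.g. (['P1', 'P2'], 'R')
    facts: iterable of known atoms; query: atom to test for entailment."""
    known = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in known and all(p in known for p in premises):
                known.add(conclusion)   # fire the rule: all premises satisfied
                changed = True
    return query in known

rules = [(['P1', 'P2'], 'R'),   # P1 and P2 => R
         (['R'], 'S')]          # R => S
print(horn_entails(rules, ['P1', 'P2'], 'S'))  # True
```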

Rule-based Systems
 A rule-based system is also called a production system.
 Rule-based systems (RBS) provide automatic problem-solving tools for capturing human
expertise and decision making.
 RBS are a means for codifying the problem solving of human experts.
 Experts typically express most of their problem-solving techniques in terms of antecedent-
consequent rules.
 The main properties of rule-based systems are:
• They incorporate practical human knowledge in if-then rules;
• Their skill increases proportionally to the enlargement of their knowledge;
• They can solve a wide range of potentially complex problems by selecting relevant rules and
then combining the results;
• They adaptively determine the best sequence of rules to examine;
• They explain their conclusions by retracing their lines of reasoning.
Rule-based Systems
 A production rule is an:
IF situation THEN action
IF premise THEN conclusion
IF antecedent THEN consequent
 Rule-based systems are the most popular type of expert systems.
 Two inference methods are used in rule-based systems:
 Forward reasoning / forward chaining
 Backward reasoning / backward chaining
 Advantages of rule-based systems
• Modularity: Each rule is a separate unit. This makes adding, editing or removing rules
easy, giving great flexibility to the system.
• Uniformity: The same format is used for representing all of the knowledge.
• Naturalness: In many domains rules are the natural way to express the knowledge.
 Disadvantages of rule-based systems
• Infinite chaining
• Addition of new contradictory knowledge
• Modification of existing knowledge
• Large number of rules needed to cover some domains (e.g. air traffic control)
Forward Chaining
 Forward chaining is one of the two main methods of reasoning when using inference rules
 Described logically as repeated application of modus ponens.
 Forward chaining is a popular implementation strategy for expert systems, business and
production rule systems.
 Forward chaining starts with the available data and uses inference rules to extract more data
until a goal is reached.
 An inference engine using forward chaining searches the inference rules until it finds one where
the antecedent (If clause) is known to be true. When such a rule is found, the engine can
conclude, or infer, the consequent (Then clause), resulting in the addition of new information to
its data.
 Inference engines will iterate through this process until a goal is reached.
 Because the data determines which rules are selected and used, this method is called data-
driven.

Forward Chaining
 The forward chaining approach is often employed by expert systems, such as CLIPS.
 One of the advantages of forward-chaining over backward-chaining is that the reception
of new data can trigger new inferences, which makes the engine better suited to dynamic
situations in which conditions are likely to change.
 It is a popular reasoning strategy in expert systems and production rule systems.
 Start from a set of facts (data available) and check to see if the premises of any rules are
satisfied. If there is a match then the rule fires (is executed). The steps followed in
forward chaining are:
I. Matching: Compare rules with known facts and find rules that are satisfied.
II. Conflict Resolution: More than one rule may be satisfied. Conflict resolution is the
process of selecting the one with highest priority for execution.
III.Execution: The rule selected is executed (fired). This may result in a new fact(s) to
be added and the process continues forward.
Forward Chaining
 Example: Animal identification system
IF X croaks and eats flies THEN X is a frog
IF X chirps and sings THEN X is a canary
IF X is a frog THEN X is green
IF X is a canary THEN X is yellow
 Goal: find the color of a pet, given that it croaks and eats flies.
 The rule base is searched and the first rule is selected, because its antecedent matches the
known facts.
 The consequent (X is a frog) is added to the data.
 The rule base is searched again, and this time the third rule is selected.
 The new consequent (X is green) is added to the data.
 Nothing more can be inferred from this information.
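The forward-chaining run above can be sketched directly in a few lines of Python (the fact and rule strings are my own encoding of the example):

```python
# Rules from the animal identification example: (premises, conclusion).
rules = [({'croaks', 'eats flies'}, 'is a frog'),
         ({'chirps', 'sings'}, 'is a canary'),
         ({'is a frog'}, 'is green'),
         ({'is a canary'}, 'is yellow')]

facts = {'croaks', 'eats flies'}     # what we observe about the pet

changed = True
while changed:                       # forward chaining: fire rules until fixpoint
    changed = False
    for premises, conclusion in rules:
        if premises <= facts and conclusion not in facts:
            facts.add(conclusion)    # the rule fires: add its consequent
            changed = True

print('is green' in facts)  # True: the pet is a frog, hence green
```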
Backward Chaining
 It is also called goal driven reasoning.
 Backward chaining (or backward reasoning) is an inference method that can be described as working
backward from the goal(s).
 It is used in automated theorem provers, proof assistants and other artificial intelligence applications.
 Rules are based on the modus ponens inference rule.
 It is one of the two most commonly used methods of reasoning with inference rules and logical
implications.
 Backward chaining systems usually employ a depth-first search strategy.
 Backward chaining starts with a list of goals (or a hypothesis) and works backwards from the consequent
to the antecedent to see if there is data available that will support any of these consequents.
 An inference engine using backward chaining would search the inference rules until it finds one which
has a consequent (Then clause) that matches a desired goal.
 If the antecedent (If clause) of that rule is not known to be true, then it is added to the list of goals.
 Because the list of goals determines which rules are selected and used, this method is called goal-driven,
in contrast to data-driven forward-chaining inference. The backward chaining approach is often
employed by expert systems.
56
Forward Vs. Backward Chaining
 Forward chaining is data-driven: it starts from known facts and fires rules whose
antecedents match, adding consequents until the goal is reached.
 Backward chaining is goal-driven: it starts from a goal and searches for rules whose
consequents match it, proving their antecedents as sub-goals (usually depth-first).
57
Backward Chaining
The following steps are followed in backward reasoning:
1. Form a stack consisting of all the possible goals.
2. Try to prove the first goal on the top of stack. Find all the rules that can satisfy this goal.
3. Examine the premises of each rule
I. if all premises of the rule are satisfied then fire the rule and remove the
corresponding goal from the stack and then go to step 2.
II. if one or more of the premises cannot be satisfied using the current facts, then find
rules that can satisfy the premise and put their action parts as sub-goals on the top of
the stack, and go to step 2.
III. if no rules can be found in step II, then ask the user; add the value to the working
memory if it satisfies the premise, otherwise try the next rule.
4. If all the rules that can satisfy the goal have failed, then remove the goal from the stack
and go to step 2. If the stack is empty, then stop.
58
Backward Chaining Example
 Consider the same rule base as in the forward-chaining example.
 The rule base is searched, and the third and fourth rules are selected because their
consequents (X is green, X is yellow) match the goal (find the color).
 Since it is not known whether the pet is a frog or a canary, both antecedents are added
to the goal list.
 The rule base is searched again, and this time the first two rules are selected, since their
consequents (X is a frog, X is a canary) match the new sub-goals just added to the list.
 The antecedent of the first rule (X croaks and eats flies) is known to be true, therefore it
can be concluded that the pet is a frog, not a canary.
 The goal of determining the pet's color is achieved: the pet is a frog, so its color is green.
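The goal-directed search above can be sketched as a small recursive procedure in Python. As before, the rule encoding and the function name `prove` are my own illustration, assuming an acyclic rule base (a real engine would also guard against cyclic goals).

```python
# Minimal backward-chaining sketch over the same animal rule base.
RULES = [
    (["croaks", "eats flies"], "is a frog"),
    (["chirps", "sings"], "is a canary"),
    (["is a frog"], "is green"),
    (["is a canary"], "is yellow"),
]

def prove(goal, facts):
    """Return True if goal is a known fact or can be derived backward."""
    if goal in facts:
        return True
    # Find rules whose consequent matches the goal, then prove each
    # premise as a sub-goal (depth-first, via recursion).
    for antecedents, consequent in RULES:
        if consequent == goal and all(prove(p, facts) for p in antecedents):
            return True
    return False

facts = {"croaks", "eats flies"}
print(prove("is green", facts))   # True: X is a frog, hence green
print(prove("is yellow", facts))  # False: "is a canary" cannot be proved
```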
59
Bayesian Networks
 Bayesian networks have their roots in Bayes' theorem.
 It is a probabilistic graphical model that represents a set of random variables and their
conditional dependencies via a directed acyclic graph (DAG).
 Bayesian networks consist of:
• a DAG, whose edges represent relationships among random variables that are often (but
not always) causal;
• the prior probability distribution of every variable that is a root in the DAG;
• the conditional probability distribution of every non-root variable given each set of values
of its parents.
 Bayesian Belief Networks (BBNs) can reason with networks of propositions and associated
probabilities.
 Useful for many AI problems
• Diagnosis, Expert systems, Planning, and Learning
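The three ingredients listed above (a DAG, root priors, and conditional probability tables) can be illustrated with the smallest possible network: one root and one child. The network (Rain → WetGrass) and all probability numbers below are hypothetical, chosen only for illustration.

```python
# A minimal two-node Bayesian network sketch: Rain -> WetGrass.
p_rain = 0.2                        # prior of the root node, p(Rain)
p_wet_given = {True: 0.9,           # CPT: p(WetGrass | Rain)
               False: 0.1}          #      p(WetGrass | not Rain)

# Marginal p(WetGrass): sum over the parent's values
p_wet = p_rain * p_wet_given[True] + (1 - p_rain) * p_wet_given[False]

# Diagnostic reasoning via Bayes' rule: p(Rain | WetGrass)
p_rain_given_wet = p_rain * p_wet_given[True] / p_wet

print(round(p_wet, 3), round(p_rain_given_wet, 3))
```

Observing wet grass raises the probability of rain well above its 0.2 prior, which is exactly the kind of diagnostic inference BBNs are used for.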
60
BAYES' FORMULA
 A variant of Bayes' formula to reason about the probability of hypothesis H given evidence E in the
presence of background knowledge B:

p(H | E ∧ B) = p(H | B) · p(E | H ∧ B) / p(E | B)
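A quick numeric check of the formula, with all probability values hypothetical (chosen only so the arithmetic is easy to follow); B is left implicit.

```python
# H = hypothesis, E = evidence, B = background knowledge (implicit).
p_H = 0.1           # prior p(H | B)
p_E_given_H = 0.9   # likelihood p(E | H and B)
p_E = 0.3           # evidence probability p(E | B)

# Bayes' formula: posterior = prior * likelihood / evidence
p_H_given_E = p_H * p_E_given_H / p_E
print(round(p_H_given_E, 3))  # 0.3 -- evidence tripled the prior
```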
61
Assignment
 All married employees earning Rs. 3,00,000.00 or more per year in Nepal pay taxes. All
unmarried employees earning Rs. 2,50,000.00 or more per year in Nepal pay taxes. A
university professor in Nepal earns Rs. 6,00,000.00 and has to pay 25% tax. No other
employee in the university earns more than the professor. Some Nepalese citizens earn less
than Rs. 500 per day and only have to pay a 1% service tax. Express these statements in FOPL.
 Inference in FOPL
 If x is a parent of y, then x is older than y. If x is the mother of y, then x is a parent of y. Sita
is a mother of Shyam. Conclusion : Sita is older than Shyam.
 Mapping to FOPL
x y Parent(x ,y)  Older(x ,y)
x y Mother(x ,y)  Parent(x ,y)
Mother(Sita, Shyam)
 Conclusion: Older((Sita, Shyam)
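The two-step chain above (Mother ⟹ Parent ⟹ Older) can be mirrored in a tiny Python sketch. The tuple encoding of ground facts and the function name are my own illustration; this is forward application of the two rules, not a general FOPL inference engine.

```python
# Ground facts as (predicate, arg1, arg2) tuples.
facts = {("Mother", "Sita", "Shyam")}

def apply_rules(facts):
    new = set(facts)
    for pred, x, y in facts:
        if pred == "Mother":
            new.add(("Parent", x, y))   # Mother(x,y) -> Parent(x,y)
    for pred, x, y in set(new):
        if pred == "Parent":
            new.add(("Older", x, y))    # Parent(x,y) -> Older(x,y)
    return new

print(("Older", "Sita", "Shyam") in apply_rules(facts))  # True
```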
64
Resolution: Assignment
1. All oversmart persons are stupid. Children of oversmart people are naughty. Ram is a child
of Hari. Hari is oversmart. Show that Ram is naughty using the FOPL-based resolution method.
2. Marcus was a man. Marcus was a Pompeian. All Pompeians are Romans. Caesar was a ruler.
All Romans were either loyal to Caesar or hated him. Everyone is loyal to someone. People only
try to kill rulers they are not loyal to. Marcus tried to kill Caesar. Did Marcus hate Caesar?
3. Given premises: If X is on top of Y, Y supports X. If X is above Y and they are touching each
other, X is on top of Y. A cup is above the book. A cup is touching a book. Is the book supporting
the cup? Use the FOPL-based resolution method.
4. John likes all kinds of food. Apples are food. Chicken is food. Anything anyone eats and isn’t
killed by is food. Bill eats peanuts, and is still alive. Sue eats everything that Bill eats. Prove
that John likes peanuts using resolution refutation.
65