AI Unit-3 ppt


Artificial Intelligence

Topic: Knowledge Representation & Reasoning


by
Dr. Vikas Goel
Department of IT
KIET Group of Institutions
Knowledge-Based Agent
o An intelligent agent needs knowledge about the real world for taking decisions, and reasoning over that knowledge, to act efficiently.

o Knowledge-based agents are those agents which have the capability of:

o maintaining an internal state of knowledge (Knowledge Base)

o reasoning over that knowledge (Inference Engine)
o updating their knowledge after observations and taking actions (Learning)

o Knowledge-based agents are composed of two main parts:

o Knowledge-base

o Inference system.
Inference system
• Inference means deriving new sentences from old.
• Inference system allows us to add a new sentence to the knowledge
base.
• A sentence is a proposition about the world.
• Inference system applies logical rules to the KB to deduce new
information.
• An inference system mainly works in two modes, which are given as:
• Forward chaining
• Backward chaining
A generic knowledge-based agent

TELL: This operation tells the knowledge base what it perceives from the environment.
ASK: This operation asks the knowledge base what action it should perform.
Perform: It performs the selected action.
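The TELL–ASK–Perform cycle can be sketched as a small program. The following Python sketch is illustrative only and not part of the slides; the class name KBAgent and the trivial list-based knowledge base are assumptions made for the example, standing in for a real inference engine.

```python
# Minimal sketch of a generic knowledge-based agent (illustrative only).
# The knowledge base is just a list of sentences; a real agent would plug
# a proper inference engine in behind ask().

class KBAgent:
    def __init__(self):
        self.kb = []      # the knowledge base: a list of sentences (strings here)
        self.t = 0        # a time counter

    def tell(self, sentence):
        """TELL: add what the agent perceives to the knowledge base."""
        self.kb.append(sentence)

    def ask(self, query):
        """ASK: query the knowledge base for an action (trivial lookup here)."""
        for s in self.kb:
            if s.startswith("action:"):
                return s
        return "action: do-nothing"

    def agent_program(self, percept):
        self.tell(f"percept at t={self.t}: {percept}")   # TELL the percept
        action = self.ask("what should I do?")           # ASK for an action
        self.tell(f"performed at t={self.t}: {action}")  # record the action taken
        self.t += 1
        return action                                    # Perform the action


agent = KBAgent()
agent.tell("action: move-forward")
print(agent.agent_program("wall ahead"))   # -> action: move-forward
```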
Various levels of knowledge-based agent
• The knowledge level is the first level of a knowledge-based agent.
• At this level, we need to specify what the agent knows and what the agent's goals
are.
• With these specifications, we can fix its behaviour.
• For example, suppose an automated taxi agent needs to go from station A to
station B, and it knows the way from A to B; this comes at the knowledge
level.
The logical level
• At this level, we understand how the knowledge is represented and stored.
• At this level, sentences are encoded into different logics.
• At the logical level, an encoding of knowledge into logical sentences occurs.
• At the logical level, we can expect the automated taxi agent to reach
the destination B.
The implementation level
• This is the physical representation of logic and knowledge.
• At the implementation level, the agent performs actions as per the logical and
knowledge levels.
• At this level, an automated taxi agent actually implements its
knowledge and logic so that it can reach the destination.
Knowledge Representation
 Knowledge representation and reasoning (KR, KRR) is the part of Artificial Intelligence which
is concerned with how AI agents think and how thinking contributes to the intelligent behaviour of
agents.
 It is responsible for representing information about the real world so that a computer can
understand it and can utilize this knowledge to solve complex real-world problems such as
diagnosing a medical condition or communicating with humans in natural language.
 It is also a way which describes how we can represent knowledge in artificial intelligence.
 Knowledge representation is not just storing data into some database, but it also enables an
intelligent machine to learn from that knowledge and experiences so that it can behave
intelligently like a human.
What to Represent:
Following are the kind of knowledge which needs to be represented in AI systems:
o Object: All the facts about objects in our world domain. E.g., guitars contain strings,
trumpets are brass instruments.
o Events: Events are the actions which occur in our world.
o Performance: It describes behaviour which involves knowledge about how to do things.
o Meta-knowledge: It is knowledge about what we know.
o Facts: Facts are the truths about the real world and what we represent.
o Knowledge-Base: The central component of a knowledge-based agent is the
knowledge base, represented as KB. The knowledge base is a group of sentences
(here, "sentence" is used as a technical term and is not identical to a sentence in English).
Types of knowledge
1. Declarative Knowledge:
• Declarative knowledge is to know about something.
• It includes concepts, facts, and objects.
2. Procedural Knowledge
• Procedural knowledge is a type of knowledge which is responsible for knowing how to do
something.
• It includes rules, strategies, procedures, agendas, etc.
3. Meta-knowledge:
• Knowledge about the other types of knowledge is called Meta-knowledge.
4. Heuristic knowledge:
• Heuristic knowledge represents the knowledge of experts in a field or subject.
• Heuristic knowledge consists of rules of thumb based on previous experiences and awareness of
approaches, which tend to work well but are not guaranteed.
5. Structural knowledge:
• It describes relationships between concepts or objects such as kind of, part of, and grouping of
something.
AI knowledge cycle
An Artificial intelligence system has the following components for
displaying intelligent behavior:
• Perception
• Learning
• Knowledge Representation and Reasoning
• Planning
• Execution
Approaches to knowledge representation
• Simple relational knowledge
• It is the simplest way of storing facts, which uses the relational method: each
fact about a set of objects is set out systematically in columns.

Player    Weight    Age
Player1   65        23
Player2   58        18
Player3   75        24

• Inheritable knowledge
• All data must be stored in a hierarchy of classes.
• Inferential knowledge
• Inferential knowledge approach represents knowledge in the form of formal
logics.
• Example: 1. Marcus is a man. 2. All men are mortal.
man(Marcus)
∀x: man(x) → mortal(x)
• Procedural knowledge
• The procedural knowledge approach uses small programs and code which
describe how to do specific things and how to proceed.
• In this approach, one important construct is the If-Then rule.
• This knowledge can be encoded in programming languages such as LISP
and Prolog, as in the sketch below.
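The slides mention LISP and Prolog; purely for illustration, the minimal sketch below encodes two If-Then production rules in Python instead. The facts dictionary and the (condition, action) rule format are assumptions made for this example.

```python
# A tiny production-rule ("If-Then") sketch of procedural knowledge.
# Each rule is a (condition, action) pair; the condition is a function of the facts.

facts = {"engine_on": False, "key_turned": True}

rules = [
    (lambda f: f["key_turned"] and not f["engine_on"], "start the engine"),
    (lambda f: f["engine_on"], "drive"),
]

for condition, action in rules:
    if condition(facts):          # IF the condition holds over the current facts
        print("THEN:", action)    # THEN perform (here: print) the action
        break                     # the first matching rule fires
```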
Techniques of knowledge representation

• There are mainly four ways of knowledge representation which are


given as follows:
• Logical Representation

• Semantic Network Representation

• Frame Representation

• Production Rules
Logical Representation
• Logical representation is a language with some concrete rules which deals with propositions and
has no ambiguity in representation.
• Logical representation means drawing conclusions based on various conditions.
• This representation lays down some important communication rules.
• It consists of precisely defined syntax and semantics which support sound inference.
• Each sentence can be translated into logic using this syntax and semantics.
Syntax:
• Syntax is the set of rules which decides how we can construct legal sentences in the logic.
• It determines which symbols we can use in knowledge representation and
how to write those symbols.
Semantics:
• Semantics are the rules by which we can interpret a sentence in the logic.
• Semantics also involves assigning a meaning to each sentence.
• Logical representation can be categorised into mainly two logics:
• Propositional logic
• Predicate logic
Propositional logic in Artificial intelligence

• Propositional logic (PL) is the simplest form of logic where all the
statements are made by propositions.
• A proposition is a declarative statement which is either true or false.
• It is a technique of knowledge representation in logical and
mathematical form.
• Example:
• a) It is Sunday.
• b) The Sun rises in the West. (False proposition)
• c) 3+3= 7(False proposition)
• d) 5 is a prime number.
Following are some basic facts about propositional logic:
• Propositional logic is also called Boolean logic as it works on 0 and 1 / true
or false.
• In propositional logic, we use symbolic variables to represent the logic,
such as A, B, C, P, Q.
• Propositional logic consists of proposition symbols and logical
connectives.
• These connectives are also called logical operators.
• Connectives can be said as a logical operator which connects two
sentences.
• A proposition formula which is always true is called tautology, and it is also
called a valid sentence.
• A proposition formula which is always false is called Contradiction.
• Statements which are questions, commands, or opinions, such as "Where is
Rohini?", "How are you?", and "What is your name?", are not propositions.
Syntax of propositional logic:
• The syntax of propositional logic defines the allowable sentences for the
knowledge representation. There are two types of Propositions:
• Atomic Propositions
• Compound propositions
• Atomic propositions are simple propositions. Each consists of a single
proposition symbol. These are the sentences which must be either true or
false.
• Example:
• a) "2+2 is 4" is an atomic proposition, as it is a true fact.
• b) "The Sun is cold" is also an atomic proposition, as it is a false fact.
• Compound propositions are constructed by combining simpler or atomic
propositions, using parentheses and logical connectives.
• Example:
• a) "It is raining today, and street is wet."
• b) "Ankit is a doctor, and his clinic is in Mumbai."
Logical Connectives:
• We can create compound propositions with the help of logical connectives. There
are mainly five connectives, which are given as follows:
• Negation: A sentence such as ¬P is called the negation of P. A literal can be either
a positive literal or a negative literal.
• Conjunction: A sentence which has ∧ connective such as, P ∧ Q is called a
conjunction.
• Example: Rohan is intelligent and hardworking. It can be written as,
P= Rohan is intelligent, Q= Rohan is hardworking. P∧ Q.
• Disjunction: A sentence which has the ∨ connective, such as P ∨ Q, is called a
disjunction, where P and Q are propositions.
• Example: "Ritika is a doctor or an engineer".
Here P = Ritika is a doctor, Q = Ritika is an engineer, so we can write it as P ∨ Q.
• Implication: A sentence such as P → Q, is called an implication. Implications are
also known as if-then rules. It can be represented as
If it is raining, then the street is wet.
Let P= It is raining, and Q= Street is wet, so it is represented as P → Q
• Biconditional: A sentence such as P ⇔ Q is a biconditional sentence. Example: "I am
breathing if and only if I am alive."
P = I am breathing, Q = I am alive; it can be represented as P ⇔ Q.
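To make the five connectives concrete, the short Python sketch below (not part of the slides) evaluates them over every truth assignment and also checks that P ∨ ¬P is a tautology while P ∧ ¬P is a contradiction.

```python
# Truth tables for the five propositional connectives, and a tautology check.
from itertools import product

def implies(p, q):        # P -> Q  is false only when P is true and Q is false
    return (not p) or q

def iff(p, q):            # P <-> Q is true when P and Q have the same value
    return p == q

print("P Q | ¬P  P∧Q  P∨Q  P→Q  P⇔Q")
for p, q in product([True, False], repeat=2):
    print(p, q, "|", not p, p and q, p or q, implies(p, q), iff(p, q))

# A formula is a tautology if it is true under every assignment,
# and a contradiction if it is false under every assignment.
print("P ∨ ¬P is a tautology:", all((p or not p) for p in [True, False]))          # True
print("P ∧ ¬P is a contradiction:", all(not (p and not p) for p in [True, False])) # True
```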
Thank
You
Artificial Intelligence
Topic: Theory of First order Logic
by
Dr. Vikas Goel
Department of IT
KIET Group of Institutions
Predicate logic / First Order Logic (FOL)
o First-order logic is another way of knowledge representation in artificial intelligence. It
is an extension to propositional logic.
o FOL is sufficiently expressive to represent the natural language statements in a concise
way.
o First-order logic is also known as Predicate logic.
o First-order logic is a powerful language that expresses information about objects in
a more natural way and can also express the relationships between those objects.
o First-order logic (like natural language) does not only assume that the world
contains facts like propositional logic but also assumes the following things in
the world:
o Objects: A, B, people, numbers, colours, wars, theories, squares…….
o Relations: It can be a unary relation such as: red, round, is adjacent, or an n-ary
relation such as: the sister of, brother of, has color, comes between
o Function: Father of, best friend, third inning of, end of, ......
o As a natural language, first-order logic also has two
main parts:
o Syntax
o Semantics
Basic Elements of First-order logic:
Constants      1, 2, A, John, Mumbai, cat, ....
Variables      x, y, z, a, b, ....
Predicates     Brother, Father, >, ....
Functions      sqrt, LeftLegOf, ....
Connectives    ∧, ∨, ¬, ⇒, ⇔
Equality       =
Quantifiers    ∀, ∃
First Order Logic
 First-order logic statements can be divided into two parts:
Subject: Subject is the main part of the statement.
Predicate: A predicate can be defined as a relation, which binds two atoms together in
a statement.

 Consider the statement: "x is an integer.", it consists of two parts, the first part x is the
subject of the statement and second part "is an integer," is known as a predicate.

 Atomic sentences:
o Atomic sentences are the most basic sentences of first-order logic. These sentences are
formed from a predicate symbol followed by a parenthesis with a sequence of terms.
o We can represent atomic sentences as Predicate(term1, term2, ......, term n), e.g., Integer(x).

• Example: Ravi and Ajay are brothers: => Brothers(Ravi, Ajay).
Chinky is a cat: => cat(Chinky).
 Complex Sentences:
o Complex sentences are made by combining atomic sentences using connectives.
Quantifiers in First-order logic
A quantifier is a language element which generates quantification, and quantification
specifies the quantity of specimen in the universe of discourse.
These are the symbols that permit to determine or identify the range and scope of the
variable in the logical expression. There are two types of quantifier:
Universal Quantifier, (for all, everyone, everything)
Existential quantifier, (for some, at least one).
Universal Quantifier
Universal quantifier is a symbol of logical representation, which specifies that the statement within its range is
true for everything or every instance of a particular thing.
The Universal quantifier is represented by a symbol ∀, which resembles an inverted A.
• If x is a variable, then ∀x is read as:
For all x
For each x
For every x.
• Example:
All men drink coffee.
Let x be a variable which refers to a man, so all x can be represented in the UOD (universe of discourse).
 ∀x man(x) → drink(x, coffee).
 It will be read as: For all x, if x is a man, then x drinks coffee.

Note: In universal quantifier we use implication "→".


Existential Quantifier
Existential quantifiers are the type of quantifiers, which express that the statement within its scope is true for
at least one instance of something.
It is denoted by the logical operator ∃, which resembles a reversed E. When it is used with a predicate variable,
it is called an existential quantifier.
• If x is a variable, then existential quantifier will be ∃x or ∃(x). And it will be read as:
There exists a 'x.'
For some 'x.'
For at least one 'x.'
• Example:
Some boys are intelligent.
∃x: boys(x) ∧ intelligent(x)
It will be read as:
There are some x where x is a boy who is intelligent.

Note: In Existential quantifier we always use AND or


Conjunction symbol (∧).
Properties of Quantifiers:

o In universal quantifier, ∀x∀y is similar to ∀y∀x.

o In Existential quantifier, ∃x∃y is similar to ∃y∃x.

o ∃x∀y is not similar to ∀y∃x.


Some Examples of FOL using quantifier
1. All birds fly.
In this question the predicate is "fly(bird)."
And since there are all birds who fly so it will be represented as follows.
∀x bird(x) →fly(x).
2. Every man respects his parent.
In this question, the predicate is "respect(x, y)," where x=man, and y= parent.
Since there is every man so will use ∀, and it will be represented as follows:
∀x man(x) → respects (x, parent).
3. Some boys play cricket.
In this question, the predicate is "play(x, y)," where x= boys, and y= game. Since
there are some boys so we will use ∃, and it will be represented as:
∃x boys(x) ∧ play(x, cricket).
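Over a finite universe of discourse, ∀ behaves like Python's all() and ∃ like any(). The sketch below is illustrative only and not from the slides; the tiny domain and the bird/boy predicates are invented for the example.

```python
# Quantifiers over a finite universe of discourse (illustrative domain).
universe = ["sparrow", "penguin", "rahul", "amit"]

bird          = {"sparrow", "penguin"}
flies         = {"sparrow"}
boy           = {"rahul", "amit"}
plays_cricket = {"rahul"}

# ∀x bird(x) → fly(x)   (false here, because the penguin does not fly)
all_birds_fly = all((x not in bird) or (x in flies) for x in universe)

# ∃x boy(x) ∧ play(x, cricket)   (true here, because Rahul plays cricket)
some_boys_play = any((x in boy) and (x in plays_cricket) for x in universe)

print(all_birds_fly, some_boys_play)   # False True
```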
Practice Questions:
Translate the following into predicate logic:
1. John likes all kinds of food.
∀x: Food(x) → Likes(John, x)
2. Apples are food.
Food(Apple)
3. Bill eats peanuts and is still alive.
Eats(Bill, Peanuts) ∧ Alive(Bill)
Practice Questions:
Convert the following sentences into wffs of Predicate Logic (first-order logic).
(i) Ruma dislikes children who drink tea.
∀x: Child(x) ∧ DrinkTea(x) → Dislikes(Ruma, x)
(ii) Any person who is respected by every person is a king.
∃x ∀y: Person(y) ∧ Respects(y, x) → King(x)
Thank
You
Knowledge Representation &
Reasoning
INFERENCE IN FOL
by
Dr. Vikas Goel
Department of IT
KIET Group of Institutions
Contents
• INTRODUCTION
• INFERENCE RULES
• CONJUNCTIVE NORMAL FORM (CNF)
Introduction
• In artificial intelligence, we need intelligent computers which can
create new logic from old logic or from evidence; generating
conclusions from evidence and facts is termed inference.
• Inference rules are the templates for generating valid arguments.
• Inference rules for PL apply to FOL as well.
• New (sound) inference rules for use with quantifiers:
1. Universal Generalization
• Universal generalization is a valid inference rule which states that if
premise P(c) is true for an arbitrary element c in the universe of
discourse, then we can conclude ∀x P(x).
• It can be represented as: P(c) (for arbitrary c) ⟹ ∀x P(x).
• This rule can be used if we want to show that every element has a
similar property.
• In this rule, x must not appear as a free variable.
• Example: Let's represent, P(c): "A byte contains 8 bits", so for ∀ x
P(x) "All bytes contain 8 bits.", it will also be true.
2. Universal Elimination
• If (∀x)P(x) is true, then P(c) is true, where c is a constant in the
domain of x.
• IF "Every person like ice-cream"=> ∀x Person(x) => like(X, Ice cream)
so we can infer that
"John likes ice-cream" => P(c) Person(John) =>likes (Person, Ice
Cream)
• For example, from (∀x)eats(Ziggy, x) we can infer eats(Ziggy,
IceCream).
• The variable symbol can be replaced by any ground term, i.e., any
constant symbol or function symbol applied to ground terms only.
• The new KB is logically equivalent to the previous KB.
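Universal Elimination amounts to substituting a ground term for the universally quantified variable. Below is a minimal sketch, assuming sentences are encoded as nested Python tuples (a representation chosen here only for illustration).

```python
# Universal Elimination as substitution: replace a variable by a ground term.
def substitute(expr, theta):
    """Apply a substitution dict (variable -> term) to a nested-tuple sentence."""
    if isinstance(expr, tuple):
        return tuple(substitute(e, theta) for e in expr)
    return theta.get(expr, expr)

# ∀x  Person(x) ⇒ Likes(x, IceCream), written here without the quantifier:
rule = ("=>", ("Person", "x"), ("Likes", "x", "IceCream"))

# Eliminating the universal quantifier with the constant John:
print(substitute(rule, {"x": "John"}))
# ('=>', ('Person', 'John'), ('Likes', 'John', 'IceCream'))
```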
3. Existential Introduction
• If P(c) is true, then (∃x)P(x) is inferred.
• For example, from eats(Ziggy, IceCream) we can infer (∃x)eats(Ziggy, x).
• All instances of the given constant symbol are replaced by the new
variable symbol.
• Note that the variable symbol cannot already exist anywhere in the
expression.
• The new KB is not logically equivalent to old KB, but it will be satisfiable
if old KB was satisfiable.
4. Existential Elimination
• From (∃x)P(x) infer P(c).
• For example, from (∃x)eats(Ziggy, x) infer eats(Ziggy, Cheese).
• Note that the variable is replaced by a brand new constant that does
not occur in this or any other sentence in the Knowledge Base.
• Example: Let's say that,
"Priyanka got good marks in English."
"Therefore, someone got good marks in English.“
• All we know is there must be some constant that makes this true, so
we can introduce a brand new one to stand in for that
(unknown) constant.
Entailment
• Entailment means that one thing follows from another: KB ⊨ α
• Knowledge base KB entails sentence α if and only if α is true in all
worlds where KB is true
• E.g., the KB containing the Giants won and the Reds won entails
Either the Giants won or the Reds won
• E.g., x+y = 4 entails 4 = x+y
• Entailment is a relationship between sentences (i.e., syntax) that is
based on semantics
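For propositional sentences, KB ⊨ α can be checked directly by enumerating all models. The sketch below is illustrative only; encoding KB and α as Python lambdas over a model dictionary is an assumption made for the example.

```python
# Model-checking entailment for propositional logic: KB ⊨ α iff α is true
# in every model (truth assignment) in which KB is true.
from itertools import product

def entails(kb, alpha, symbols):
    for values in product([True, False], repeat=len(symbols)):
        model = dict(zip(symbols, values))
        if kb(model) and not alpha(model):
            return False          # found a model of KB in which α is false
    return True

# KB: "the Giants won" and "the Reds won"
kb    = lambda m: m["Giants"] and m["Reds"]
alpha = lambda m: m["Giants"] or m["Reds"]   # "Either the Giants won or the Reds won"

print(entails(kb, alpha, ["Giants", "Reds"]))   # True
```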
Proof System
• A proof is a sequence of sentences, where each sentence is either a
premise or a sentence derived from earlier sentences in the proof by
one of the rules of inference.
• The last sentence is the theorem (also called goal or query) that we
want to prove.
Unification
• Unification is a process of making two different logical atomic
expressions identical by finding a substitution. Unification depends
on the substitution process.
• It takes two literals as input and makes them identical using
substitution.
• Let Ψ1 and Ψ2 be two atomic sentences and 𝜎 be a unifier such
that, Ψ1𝜎 = Ψ2𝜎, then it can be expressed as UNIFY(Ψ1, Ψ2).
• Example: Find the MGU for Unify{King(x), King(John)}
Let Ψ1 = King(x), Ψ2 = King(John).
Substitution θ = {John/x} is a unifier for these atoms; applying this
substitution makes both expressions identical.
• If we can find a substitution θ such that King(x) and Greedy(x) match
King(John) and Greedy(y), then θ = {x/John, y/John} works.
• Unify(α,β) = θ iff αθ = βθ

p                 q                     θ
Knows(John, x)    Knows(John, Jane)     {x/Jane}
Knows(John, x)    Knows(y, OJ)          {x/OJ, y/John}
Knows(John, x)    Knows(y, Mother(y))   {x/Mother(John), y/John}
Knows(John, x)    Knows(x, OJ)          {fail}
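The UNIFY results in the table above can be reproduced with a small unification routine. The sketch below is a textbook-style illustration, not a library API; the convention that lower-case strings are variables and tuples are compound terms is an assumption made here, and an occurs check is included.

```python
# A small unification sketch. Variables are lower-case strings, constants are
# capitalised strings, and compound terms are tuples like ("Knows", "John", "x").

def is_var(t):
    return isinstance(t, str) and t[:1].islower()

def occurs(v, t, theta):
    if v == t:
        return True
    if is_var(t) and t in theta:
        return occurs(v, theta[t], theta)
    if isinstance(t, tuple):
        return any(occurs(v, a, theta) for a in t)
    return False

def unify(x, y, theta=None):
    if theta is None:
        theta = {}
    if theta is False or x == y:
        return theta
    if is_var(x):
        return unify_var(x, y, theta)
    if is_var(y):
        return unify_var(y, x, theta)
    if isinstance(x, tuple) and isinstance(y, tuple) and len(x) == len(y):
        for a, b in zip(x, y):
            theta = unify(a, b, theta)
            if theta is False:
                return False
        return theta
    return False

def unify_var(v, t, theta):
    if v in theta:
        return unify(theta[v], t, theta)
    if is_var(t) and t in theta:
        return unify(v, theta[t], theta)
    if occurs(v, t, theta):
        return False              # occurs check
    return {**theta, v: t}

print(unify(("Knows", "John", "x"), ("Knows", "John", "Jane")))        # {'x': 'Jane'}
print(unify(("Knows", "John", "x"), ("Knows", "y", "OJ")))             # {'y': 'John', 'x': 'OJ'}
print(unify(("Knows", "John", "x"), ("Knows", "y", ("Mother", "y"))))  # x -> ('Mother', 'y'); composing with y/John gives x/Mother(John)
print(unify(("Knows", "John", "x"), ("Knows", "x", "OJ")))             # False (fail)
```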
Convert the following sentences into FOL:
• Every gardener likes the sun.

• You can fool some of the people all of the time.

• You can fool all of the people some of the time.


• All purple mushrooms are poisonous.

• No purple mushroom is poisonous.

• There are exactly two purple mushrooms.


• Bush is not tall.
• X is above Y iff X is directly on top of Y or there is a pile of one or
more other objects directly on top of one another starting with X and
ending with Y.
CONJUNCTIVE NORMAL FORM (CNF)
• A sentence represented as a conjunction of clauses is said to
be in conjunctive normal form, or CNF.
• Steps for First Order Logic (FOL) conversion to CNF
• 1. Eliminate bi-conditionals and implications:
• Eliminate ⇔, replacing α ⇔ β with (α ⇒ β) ∧ (β ⇒ α).
• Eliminate ⇒, replacing α ⇒ β with ¬α ∨ β.
• 2. Move ¬ inwards:
• ¬(∀ x p) ≡ ∃ x ¬p,
• ¬(∃ x p) ≡ ∀ x ¬p,
• ¬(α ∨ β) ≡ ¬α ∧ ¬β,
• ¬(α ∧ β) ≡ ¬α ∨ ¬β,
• ¬¬α ≡ α.
3. Standardize variables apart by renaming them: each quantifier
should use a different variable.
4. Skolemize: each existential variable is replaced by a Skolem constant
or Skolem function of the enclosing universally quantified variables.
For instance, ∃x Rich(x) becomes Rich(G1) where G1 is a new Skolem
constant.
“Everyone has a heart” ∀ x Person(x) ⇒ ∃ y Heart(y) ∧ Has(x, y)
becomes ∀ x Person(x) ⇒ Heart(H(x)) ∧ Has(x, H(x)), where H is a new
symbol (Skolem function).
5. Drop universal quantifiers For instance, ∀ x Person(x) becomes
Person(x).
6. Distribute ∨ over ∧: (α ∧ β) ∨ γ ≡ (α ∨ γ) ∧ (β ∨ γ).
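As an illustration of steps 1, 2, and 6 for the quantifier-free (propositional) case, the sketch below converts nested-tuple formulas such as ("=>", "P", "Q") to CNF; the tuple encoding is an assumption made for the example, and the quantifier-related steps (standardizing variables, Skolemization, dropping ∀) are omitted.

```python
# Propositional CNF conversion: eliminate => and <=>, push negation inwards,
# then distribute OR over AND. Formulas are nested tuples like ("=>", "P", "Q").

def eliminate_implications(f):
    if isinstance(f, str):
        return f
    op, *args = f
    args = [eliminate_implications(a) for a in args]
    if op == "<=>":
        a, b = args
        return ("and", ("or", ("not", a), b), ("or", ("not", b), a))
    if op == "=>":
        a, b = args
        return ("or", ("not", a), b)
    return (op, *args)

def move_not_inwards(f):
    if isinstance(f, str):
        return f
    op, *args = f
    if op == "not":
        a = args[0]
        if isinstance(a, str):
            return f
        if a[0] == "not":                                       # ¬¬α ≡ α
            return move_not_inwards(a[1])
        if a[0] == "and":                                       # ¬(α ∧ β) ≡ ¬α ∨ ¬β
            return ("or", *[move_not_inwards(("not", x)) for x in a[1:]])
        if a[0] == "or":                                        # ¬(α ∨ β) ≡ ¬α ∧ ¬β
            return ("and", *[move_not_inwards(("not", x)) for x in a[1:]])
    return (op, *[move_not_inwards(a) for a in args])

def distribute_or_over_and(f):
    if isinstance(f, str) or f[0] == "not":
        return f
    op, a, b = f[0], distribute_or_over_and(f[1]), distribute_or_over_and(f[2])
    if op == "or":
        if not isinstance(a, str) and a[0] == "and":            # (α ∧ β) ∨ γ
            return ("and", distribute_or_over_and(("or", a[1], b)),
                           distribute_or_over_and(("or", a[2], b)))
        if not isinstance(b, str) and b[0] == "and":            # γ ∨ (α ∧ β)
            return ("and", distribute_or_over_and(("or", a, b[1])),
                           distribute_or_over_and(("or", a, b[2])))
    return (op, a, b)

def to_cnf(f):
    return distribute_or_over_and(move_not_inwards(eliminate_implications(f)))

print(to_cnf(("not", ("=>", "P", "Q"))))     # ('and', 'P', ('not', 'Q'))
print(to_cnf(("or", ("and", "A", "B"), "C")))  # ('and', ('or', 'A', 'C'), ('or', 'B', 'C'))
```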
Exercise:
• Convert “Everybody who loves all animals is loved by someone” to CNF
• (∀x)[(∀y){Animal(y) → Loves(x,y)} → (∃y) Loves(y,x)]
Solution:
1. Eliminate implications:
∀x [¬∀y (¬Animal(y) ∨ Loves(x, y))] ∨ [∃y Loves(y, x)]
2. Move ¬ inwards:
• a) ∀x [∃y ¬(¬Animal(y) ∨ Loves(x, y))] ∨ [∃y Loves(y, x)]
• b) ∀x [∃y ¬¬Animal(y) ∧ ¬Loves(x, y)] ∨ [∃y Loves(y, x)] (De Morgan)
• c) ∀x [∃y Animal(y) ∧ ¬Loves(x, y)] ∨ [∃y Loves(y, x)] (double negation)
3. Standardize variables:
∀x [∃y Animal(y) ∧ ¬Loves(x, y)] ∨ [∃z Loves(z, x)]
4. Skolemization:
∀x [Animal(F(x)) ∧ ¬Loves(x, F(x))] ∨ [Loves(G(x), x)]
5. Drop universal quantifiers:
[Animal(F(x)) ∧ ¬Loves(x, F(x))] ∨ [Loves(G(x), x)]
6. Distribute ∨ over ∧:
[Animal(F(x)) ∨ Loves(G(x), x)] ∧ [¬Loves(x, F(x)) ∨ Loves(G(x), x)]
Thank
You
Knowledge Representation
& Reasoning
Resolution
by
Dr. Vikas Goel
Department of IT
KIET Group of Institutions
Contents
 Resolution

 Algorithm

 Steps for Resolution

 Examples
RESOLUTION
• Resolution yields a complete inference algorithm when coupled with
any complete search algorithm.
• Resolution makes use of the inference rules.
• Resolution performs deductive inference.
• Resolution uses proof by contradiction.
• One can perform Resolution from a Knowledge Base.
• A Knowledge Base is a collection of facts or one can even call it a
database with all facts.
ALGORITHM
Let F be a set of given statements and S a statement to be proved.
• 1. Convert all the statements of F to clause form.
• 2. Negate S and convert the result to clause form. Add it to the set of
clauses obtained in step 1.
• 3. Repeat until either a contradiction is found, no progress can be
made, or a predetermined amount of effort has been expended.
a. Select two clauses. Call them the parent clauses.
b. Resolve them together. The resolvent will be the
disjunction of all the literals of both parent clauses, with the following exception:
• If there is a pair of literals T1 and ¬T2 such that one parent clause
contains T1 and the other contains ¬T2, and if T1 and T2 are unifiable,
then neither T1 nor ¬T2 should appear in the resolvent.
• Here T1 and ¬T2 are called complementary literals.
c. If the resolvent is the empty clause, then a contradiction has been
found. If it is not, then add it to the set of clauses available to the
procedure.
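For the propositional case, the loop above can be sketched directly: clauses are sets of literals, complementary literals are P and ~P, and deriving the empty clause signals a contradiction. This is an illustrative sketch without unification, not the full first-order procedure.

```python
# Propositional resolution by refutation: keep resolving pairs of clauses
# until the empty clause (a contradiction) appears or nothing new is produced.

def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve(c1, c2):
    """Return all resolvents of two clauses (frozensets of literals)."""
    resolvents = []
    for lit in c1:
        if negate(lit) in c2:
            resolvents.append(frozenset((c1 - {lit}) | (c2 - {negate(lit)})))
    return resolvents

def resolution_refutation(clauses):
    clauses = set(clauses)
    while True:
        new = set()
        for a in clauses:
            for b in clauses:
                if a == b:
                    continue
                for r in resolve(a, b):
                    if not r:                 # empty clause: contradiction found
                        return True
                    new.add(r)
        if new <= clauses:                    # no progress: statement not proved
            return False
        clauses |= new

# KB: P => Q, P.  Prove Q by adding its negation ~Q and deriving the empty clause.
kb = [frozenset({"~P", "Q"}),   # P => Q in clause form
      frozenset({"P"}),
      frozenset({"~Q"})]        # negated goal
print(resolution_refutation(kb))   # True
```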
Steps for Resolution
1. Conversion of facts into first-order logic.
2. Convert FOL statements into CNF
3. Negate the statement which needs to be proved (proof by contradiction)
4. Draw resolution graph (unification).
• To better understand all the above steps, we will take an example in
which we will apply resolution.
• Example:
• Convert the following sentences into predicate logic and then prove "Was
Marcus loyal to Caesar?" using resolution:
1. Marcus was a man.
2. Marcus was a Pompeian.
3. All Pompeians were Romans.
4. Caesar was a ruler.
5. All Romans were either loyal to Caesar or hated him.
6. Everyone is loyal to someone.
7. People only try to assassinate rulers they are not loyal to.
8. Marcus tried to assassinate Caesar.
• Step-1: Conversion of Facts into FOL
• 1. Marcus was a man. man(Marcus)
• 2. Marcus was a Pompeian. Pompeian(Marcus)
• 3. All Pompeians were Romans. ∀x: Pompeian(x) → Roman(x)
• 4. Caesar was a ruler. ruler(Caesar)
• 5. All Romans were either loyal to Caesar or hated him.
∀x: Roman(x) → loyalto(x, Caesar) ∨ hate(x, Caesar)
• 6. Everyone is loyal to someone.
∀x ∃y: loyalto(x, y)
• 7. People only try to assassinate rulers they are not loyal to.
∀x ∀y: person(x) ∧ ruler(y) ∧ tryassassinate(x, y) → ¬loyalto(x, y)
• 8. Marcus tried to assassinate Caesar.
tryassassinate(Marcus, Caesar)
• Additional: ∀x: man(x) → person(x)
In clause form: ∀x: ¬man(x) ∨ person(x)
• 9. ¬man(x) ∨ person(x)
• Step-2: Conversion of FOL into CNF
• Eliminate all implication (→) and rewrite
• Move negation (¬)inwards and rewrite
• Rename variables or standardize variables
• Eliminate existential instantiation quantifier by elimination
• Drop Universal quantifiers
CNF
• man(Marcus)
• Pompeian(Marcus)
• ¬Pompeian(x1) ∨ Roman(x1)
• ruler(Caesar)
• ¬Roman(x2) ∨ loyalto(x2, Caesar) ∨ hate(x2, Caesar)
• loyalto(x3, S1(x3))
• ¬person(x4) ∨ ¬ruler(y1) ∨ ¬tryassassinate(x4, y1) ∨ ¬loyalto(x4, y1)
• tryassassinate(Marcus, Caesar)
• ¬man(x5) ∨ person(x5)
• Step-3: Negate the statement to be proved
• The statement to be proved is ¬loyalto(Marcus, Caesar); applying negation to it gives
loyalto(Marcus, Caesar), which is added to the set of clauses.
Step-4: Draw the resolution graph (resolution tree).
Thank
You
Knowledge Representation
& Reasoning
Forward & Backward Chaining
by
Dr. Vikas Goel
Department of IT
KIET Group of Institutions
Contents
 Inference engine

 Forward Chaining

 Backward Chaining

 Examples
Inference engine
• The inference engine is the component of an intelligent system in
artificial intelligence which applies logical rules to the knowledge base
to infer new information from known facts.
• The first inference engine was part of an expert system.
• Inference engine commonly proceeds in two modes, which are:
• Forward chaining
• Backward chaining
Forward Chaining
• Forward chaining is also known as the forward deduction or forward
reasoning method when using an inference engine.
• Forward chaining is a form of reasoning which starts with atomic
sentences in the knowledge base and applies inference rules in the
forward direction to extract more data until a goal is reached.
• The forward-chaining algorithm
• starts from known facts,
• triggers all rules whose premises are satisfied, and
• adds their conclusions to the known facts.

Example:
• "As per the law, it is a crime for an American to sell weapons to
hostile nations. Country A, an enemy of America, has some missiles,
and all the missiles were sold to it by Robert, who is an American
citizen."
• Prove that "Robert is a criminal."
• To solve the above problem, first, we will convert all the above facts
into first-order definite clauses, and then we will use a forward-
chaining algorithm to reach the goal.
Facts Conversion into FOL
• It is a crime for an American to sell weapons to hostile nations.
American (p) ∧ weapon(q) ∧ sells (p, q, r) ∧ hostile(r) → Criminal(p)
...(1)
• Country A has some missiles. ∃p Owns(A, p) ∧ Missile(p). It can be
written in two definite clauses by using Existential Instantiation,
introducing new Constant T1.
Owns(A, T1) ......(2)
Missile(T1) .......(3)
• All of the missiles were sold to country A by Robert.
∀p Missiles(p) ∧ Owns (A, p) → Sells (Robert, p, A) ......(4)
• Missiles are weapons.
Missile(p) → Weapons (p) .......(5)
• Enemy of America is known as hostile.
Enemy(p, America) →Hostile(p) ........(6)
• Country A is an enemy of America.
Enemy (A, America) .........(7)
• Robert is American
American(Robert). ..........(8)
Forward chaining proof
Step-1:
• In the first step we will start with the known facts and will choose the
sentences which do not have implications:
• Owns(A, T1) ......(2)
• Missile(T1) ......(3)
• Enemy(A, America) ......(7)
• American(Robert) ......(8)

Step-2:
• At the second step, we will see those facts which infer from available
facts and with satisfied premises.

• ∀p Missiles(p) ∧ Owns (A, p) → Sells (Robert, p, A) ......(4)


• Missile(p) → Weapons (p) .......(5)
• Enemy(p, America) →Hostile(p) ........(6)
Step-3:
• At step-3, we can check that Rule-(1) is satisfied with the
substitution {p/Robert, q/T1, r/A}, so we can add
Criminal(Robert), which is inferred from all the available facts. Hence we have
reached our goal statement.
• American(p) ∧ weapon(q) ∧ sells(p, q, r) ∧ hostile(r) → Criminal(p)
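The whole derivation can be sketched as a simple forward-chaining loop over ground definite clauses. The encoding below (facts as strings, rules as (premises, conclusion) pairs, already instantiated with Robert, T1, and A) is an illustrative simplification of the example, not a full first-order implementation.

```python
# Forward chaining over ground definite clauses (a simplified, pre-instantiated
# version of the "Robert is a criminal" example).

facts = {"American(Robert)", "Missile(T1)", "Owns(A,T1)", "Enemy(A,America)"}

rules = [
    ({"Missile(T1)", "Owns(A,T1)"}, "Sells(Robert,T1,A)"),                          # Rule 4
    ({"Missile(T1)"}, "Weapon(T1)"),                                                 # Rule 5
    ({"Enemy(A,America)"}, "Hostile(A)"),                                            # Rule 6
    ({"American(Robert)", "Weapon(T1)", "Sells(Robert,T1,A)", "Hostile(A)"},
     "Criminal(Robert)"),                                                            # Rule 1
]

changed = True
while changed:                       # repeat until no rule adds a new fact
    changed = False
    for premises, conclusion in rules:
        if premises <= facts and conclusion not in facts:
            facts.add(conclusion)    # premises satisfied: add the conclusion
            changed = True

print("Criminal(Robert)" in facts)   # True: the goal has been derived
```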
Backward Chaining
• Backward-chaining is also known as a backward deduction or
backward reasoning method when using an inference engine.
• A backward chaining algorithm is a form of reasoning,
• which starts with the goal and works backward,
• chaining through rules to find known facts that support the goal.
Example:
• In backward-chaining, we will use the same above example, and will
rewrite all the rules.
• American (p) ∧ weapon(q) ∧ sells (p, q, r) ∧ hostile(r) → Criminal(p)
...(1)
Owns(A, T1) ........(2)
• Missile(T1) ........(3)
• ∀p Missiles(p) ∧ Owns (A, p) → Sells (Robert, p, A) ......(4)
• Missile(p) → Weapons (p) .......(5)
• Enemy(p, America) →Hostile(p) ........(6)
• Enemy (A, America) .........(7)
• American(Robert). ..........(8)
Backward-Chaining proof
• In Backward chaining, we will start with our goal predicate, which
is Criminal(Robert), and then infer further rules.
Step-1:
• At the first step, we will take the goal fact.
• And from the goal fact, we will infer other facts, and at last, we will
prove those facts true.
• So our goal fact is "Robert is Criminal," so following is the predicate
of it.

Criminal(Robert)
Step-2:
• At the second step, we will infer other facts from the goal fact which
satisfy the rules.
• As we can see in Rule-(1), the goal predicate Criminal(Robert) is
present with the substitution {p/Robert}.
• So we will add all the conjunctive facts below the first level and will
replace p with Robert.
• American(p) ∧ weapon(q) ∧ sells(p, q, r) ∧ hostile(r) → Criminal(p)
...(1)
Step-3:
• At step-3, we will extract the further fact Missile(q), which is inferred from
Weapon(q), as it satisfies Rule-(5).
• Weapon(q) is also true with the substitution of the constant T1 for q.
• Missile(p) → Weapons (p) .......(5)
Step-4:
• At step-4, we can infer the facts Missile(T1) and Owns(A, T1) from
Sells(Robert, T1, r), which satisfies Rule-(4), with the substitution
of A in place of r. So these two statements are proved here.
• ∀p Missiles(p) ∧ Owns (A, p) → Sells (Robert, p, A) ......(4)
Step-5:
• At step-5, we can infer the fact Enemy (A, America) from Hostile (A)
which satisfies Rule- 6.
• And hence all the statements are proved true using backward
chaining.
• Enemy(p, America) →Hostile(p) ........(6)
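Backward chaining over the same simplified ground rules can be sketched as a recursive search from the goal; as before, the string-based encoding is an illustrative assumption, not the full first-order algorithm.

```python
# Backward chaining over ground definite clauses: a goal holds if it is a known
# fact, or if some rule concludes it and all of that rule's premises hold.

facts = {"American(Robert)", "Missile(T1)", "Owns(A,T1)", "Enemy(A,America)"}

rules = [
    ({"Missile(T1)", "Owns(A,T1)"}, "Sells(Robert,T1,A)"),                          # Rule 4
    ({"Missile(T1)"}, "Weapon(T1)"),                                                 # Rule 5
    ({"Enemy(A,America)"}, "Hostile(A)"),                                            # Rule 6
    ({"American(Robert)", "Weapon(T1)", "Sells(Robert,T1,A)", "Hostile(A)"},
     "Criminal(Robert)"),                                                            # Rule 1
]

def backward_chain(goal, seen=frozenset()):
    if goal in facts:
        return True
    if goal in seen:                 # avoid looping on the same subgoal
        return False
    for premises, conclusion in rules:
        if conclusion == goal and all(
                backward_chain(p, seen | {goal}) for p in premises):
            return True
    return False

print(backward_chain("Criminal(Robert)"))   # True
```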
Thank
You
