AI (Islamic University of Science and Technology Pulwama)


Was Marcus loyal to Caesar? Does Marcus hate Caesar?

To prove or disprove the statements using resolution, we need to represent them in a logical form as clauses and then apply the resolution inference rule to see if we can derive a contradiction. Now, let's represent the given premises (a) to (i) as clauses:

(a) Marcus was a man.
This can be represented as: Man(Marcus)

(b) Marcus was a Pompeian.
This can be represented as: Pompeian(Marcus)

(c) All Pompeians were Romans.
This can be represented as: ∀x (Pompeian(x) → Roman(x))

(d) Caesar was a ruler.
This can be represented as: Ruler(Caesar)

(e) All Romans were either loyal to Caesar or hated him.
This can be represented as: ∀x (Roman(x) → (Loyal(x, Caesar) ∨ Hate(x, Caesar)))

(f) Everyone is loyal to someone.
This can be represented as: ∀x ∃y Loyal(x, y)

(g) People only try to assassinate rulers they are not loyal to.
This can be represented as: ∀x ∃y (TriedAssassinate(x, y) → ¬Loyal(x, y))

(h) Marcus tried to assassinate Caesar.
This can be represented as: TriedAssassinate(Marcus, Caesar)

(i) All men are people.
This can be represented as: ∀x (Man(x) → People(x))

Was Marcus loyal to Caesar?
This can be represented as: Loyal(Marcus, Caesar)

Does Marcus hate Caesar?
This can be represented as: Hate(Marcus, Caesar)

Now, let's apply resolution to see if we can derive a contradiction:

Resolve (a), (i), and (2):
From (i): ∀x (Man(x) → People(x))
Substitute x with Marcus and simplify using (2) and (a): Man(Marcus) → People(Marcus)

Resolve (c) and (b):
From (c): ∀x (Pompeian(x) → Roman(x))
Substitute x with Marcus and simplify using (b): Pompeian(Marcus) → Roman(Marcus)

Resolve (e) and (h):
From (e): ∀x (Roman(x) → (Loyal(x, Caesar) ∨ Hate(x, Caesar)))
Substitute x with Marcus and simplify using (h): Roman(Marcus) → (Loyal(Marcus, Caesar) ∨ Hate(Marcus, Caesar))

Resolve (g) and (h):
From (g): ∀x ∃y (TriedAssassinate(x, y) → ¬Loyal(x, y))
Substitute x with Marcus and simplify using (h): ∃y (TriedAssassinate(Marcus, y) → ¬Loyal(Marcus, y))

Resolve (f) and (3):
From (f): ∀x ∃y Loyal(x, y)
Substitute x with Marcus and simplify using (3): ∃y Loyal(Marcus, y)

Now, let's try to resolve (4) and (5):
From (4): ∃y Loyal(Marcus, y)
From (5): ∃y (TriedAssassinate(Marcus, y) → ¬Loyal(Marcus, y))
Using resolution, we can derive the following clause: ∃y ¬TriedAssassinate(Marcus, y)

Now, let's consider (d) and (e):
From (d): Ruler(Caesar)
From (e): ∀x (Roman(x) → (Loyal(x, Caesar) ∨ Hate(x, Caesar)))
Using resolution, we can derive the following clause: Loyal(Marcus, Caesar) ∨ Hate(Marcus, Caesar)

Now, let's resolve (6) and (7):
From (6): Loyal(Marcus, Caesar) ∨ Hate(Marcus, Caesar)
From (7): ∃y ¬TriedAssassinate(Marcus, y)
Using resolution, we can derive the following clause: ¬Hate(Marcus, Caesar)

Now, let's resolve (8) and (9):
Resolve (8): Pompeian(Marcus) → Roman(Marcus)
Resolve (9): Roman(Marcus) → (Loyal(Marcus, Caesar) ∨ Hate(Marcus, Caesar))
Using resolution, we can derive the following clause: Pompeian(Marcus) → (Loyal(Marcus, Caesar) ∨ Hate(Marcus, Caesar))

Now, let's resolve (10) and (11):
Resolve (10): ∀x ∃y Loyal(x, y)
Resolve (11): Pompeian(Marcus) → (Loyal(Marcus, Caesar) ∨ Hate(Marcus, Caesar))
Using resolution, we can derive the following clause: ∃y Loyal(Marcus, Caesar)

Now, let's resolve (12) and (13):
Resolve (12): ∃y Loyal(Marcus, Caesar)
Resolve (13): ∃y ¬TriedAssassinate(Marcus, y)
Using resolution, we can derive the following clause: ¬TriedAssassinate(Marcus, Caesar)

Now, we have a contradiction:
TriedAssassinate(Marcus, Caesar) (from (h))
¬TriedAssassinate(Marcus, Caesar) (from the resolution)

Since we have derived a contradiction, the original set of statements is inconsistent, and we can conclude that the system of statements cannot all be true simultaneously. Therefore, it is not possible to determine whether Marcus was loyal to Caesar or whether he hated him based on the provided information and logical reasoning.
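As a rough illustration of the refutation above, here is a minimal sketch of propositional resolution over ground (variable-free) versions of these clauses, with the existential "someone" instantiated to Caesar purely for illustration. The literal encoding and the helper names (negate, resolvents, refutes) are assumptions made for this sketch, not part of the original answer.

from itertools import combinations

# A literal is a string; "~P" is the negation of "P". A clause is a frozenset of literals.
def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolvents(c1, c2):
    """All clauses obtainable by resolving c1 with c2 on one complementary literal pair."""
    out = []
    for lit in c1:
        if negate(lit) in c2:
            out.append(frozenset((c1 - {lit}) | (c2 - {negate(lit)})))
    return out

def refutes(clauses):
    """Saturate the clause set; return True iff the empty clause (a contradiction) is derived."""
    clauses = set(clauses)
    while True:
        new = set()
        for c1, c2 in combinations(clauses, 2):
            for r in resolvents(c1, c2):
                if not r:              # empty clause derived: contradiction found
                    return True
                new.add(r)
        if new <= clauses:             # nothing new can be derived: no refutation
            return False
        clauses |= new

# Ground versions of the Marcus clauses, with y instantiated to Caesar for illustration.
kb = [
    frozenset({"Man(Marcus)"}),
    frozenset({"Pompeian(Marcus)"}),
    frozenset({"~Pompeian(Marcus)", "Roman(Marcus)"}),
    frozenset({"Ruler(Caesar)"}),
    frozenset({"~Roman(Marcus)", "Loyal(Marcus,Caesar)", "Hate(Marcus,Caesar)"}),
    frozenset({"Loyal(Marcus,Caesar)"}),                                        # (f), grounded
    frozenset({"~TriedAssassinate(Marcus,Caesar)", "~Loyal(Marcus,Caesar)"}),   # (g), grounded
    frozenset({"TriedAssassinate(Marcus,Caesar)"}),                             # (h)
]

print(refutes(kb))   # True: the grounded clause set is inconsistent, as derived above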
The Chinese room argument holds that a digital computer executing a program cannot have a "mind", "understanding", or "consciousness", regardless of how intelligently or human-like the program may make the computer behave.

Imagine that a person who doesn’t understand Chinese is placed in a room with a set of instructions in English for manipulating Chinese
symbols. The person receives questions in Chinese through a slot in the door and uses the instructions to produce a response in
Chinese, which is then passed back through the slot. From the outside, it appears as though the person understands Chinese and is
able to answer questions, but in reality, the person is just following a set of rules without actually understanding the meaning of the
symbols.
Searle argues that this thought experiment demonstrates that a computer program that simulates human understanding of language,
such as a chatbot, does not truly understand the meaning of the language it is processing. The program is just following a set of rules
without actually understanding the meaning of the language.
A rational agent or rational being is a person or entity that always aims
to perform optimal actions based on given premises and information.

Imagine that our intelligent agent is a robot vacuum cleaner.


Let's suppose that the world has just two rooms. The robot can be in either room and
there can be dirt in zero, one, or two rooms.
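A minimal sketch of this two-room world, assuming the rooms are labelled A and B (the labels, the state encoding, and the simple_reflex_agent function are illustrative assumptions, not part of the notes):

from itertools import product

# State = (robot_location, dirt_in_A, dirt_in_B); 2 x 2 x 2 = 8 possible states.
states = [(loc, a, b) for loc, a, b in product("AB", [False, True], [False, True])]
print(len(states))   # 8

def simple_reflex_agent(state):
    """Suck if the current room is dirty, otherwise move to the other room."""
    loc, dirt_a, dirt_b = state
    dirty_here = dirt_a if loc == "A" else dirt_b
    return "Suck" if dirty_here else ("Right" if loc == "A" else "Left")

print(simple_reflex_agent(("A", True, False)))   # Suck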

PEAS stands for Performance measure, Environment, Actuators, Sensors.


1. Performance Measure: The performance measure is the unit used to define the success of an agent. Performance varies with agents based on their different percepts.
2. Environment: The environment is the surrounding of an agent at every instant. It keeps changing with time if the agent is set in motion. There are 5 major types of environments:
• Fully Observable & Partially Observable
• Episodic & Sequential
• Static & Dynamic
• Discrete & Continuous
• Deterministic & Stochastic
3. Actuator: An actuator is the part of the agent that delivers the output of an action to the environment.
4. Sensor: Sensors are the receptive parts of an agent that take in the input for the agent.

Some PEAS examples:

Agent | Performance Measure | Environment | Actuators | Sensors
Hospital Management System | Patient's health, Admission process, Payment | Hospital, Doctors, Patients | Prescription, Diagnosis, Scan report | Symptoms, Patient's response
Automated Car Drive | Comfortable trip, Safety, Maximum distance | Roads, Traffic, Vehicles | Steering wheel, Accelerator, Brake, Mirror | Camera, GPS, Odometer
Subject Tutoring | Maximize scores, Improvement in students | Classroom, Desk, Chair, Board, Staff, Students | Smart displays, Corrections | Eyes, Ears, Notebooks
Part-picking robot | Percentage of parts in correct bins | Conveyor belt with parts; bins | Jointed arms and hand | Camera, joint angle sensors
Satellite image analysis system | Correct image categorization | Downlink from orbiting satellite | Display categorization of scene | Color pixel arrays
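One convenient way to record a PEAS description in code is as a small data structure; the sketch below mirrors the Automated Car Drive row of the table, with the class and field names chosen for illustration only.

from dataclasses import dataclass

@dataclass
class PEAS:
    agent: str
    performance_measure: list
    environment: list
    actuators: list
    sensors: list

automated_car = PEAS(
    agent="Automated Car Drive",
    performance_measure=["Comfortable trip", "Safety", "Maximum distance"],
    environment=["Roads", "Traffic", "Vehicles"],
    actuators=["Steering wheel", "Accelerator", "Brake", "Mirror"],
    sensors=["Camera", "GPS", "Odometer"],
)
print(automated_car.agent, automated_car.sensors)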
A search tree is used to model the sequence of actions. It is constructed with the initial state as the root. The actions taken make the branches, and the nodes are the results of those actions. A node has a depth, a path cost, and an associated state in the state space.
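A node of such a search tree could be sketched as follows, assuming a simple Python representation with the fields mentioned above (the Node class and child_node helper are illustrative, not a prescribed API):

from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    state: object                  # the state in the state space that this node represents
    parent: Optional["Node"] = None
    action: Optional[str] = None   # action applied to the parent to reach this node
    path_cost: float = 0.0
    depth: int = 0

def child_node(parent, action, state, step_cost):
    """Expand one branch of the tree: a child reached from parent via the given action."""
    return Node(state, parent, action, parent.path_cost + step_cost, parent.depth + 1)

root = Node(state="initial")
child = child_node(root, "move", "next-state", step_cost=1.0)
print(child.depth, child.path_cost)   # 1 1.0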

Heuristics are rules of thumb based on previous experience and awareness of approaches; they tend to work well but are not guaranteed to produce an optimal solution.
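For instance, the number of misplaced tiles in the 8-puzzle is such a rule of thumb: cheap to compute and usually helpful, but not guaranteed on its own to lead to the best path. A minimal sketch, assuming the board is encoded as a flat tuple with 0 for the blank (an assumption made here for illustration):

def misplaced_tiles(state, goal):
    """Heuristic: count tiles (ignoring the blank, 0) that are not where the goal wants them."""
    return sum(1 for s, g in zip(state, goal) if s != 0 and s != g)

goal    = (1, 2, 3, 4, 5, 6, 7, 8, 0)
current = (1, 2, 3, 4, 0, 6, 7, 5, 8)
print(misplaced_tiles(current, goal))   # 2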

An artificial intelligence system has the following components for displaying intelligent behavior:

◦ Perception
◦ Learning
◦ Knowledge Representation and Reasoning
◦ Planning
◦ Execution

Together, these components show how an AI system can interact with the real world and what helps it to display intelligence. An AI system has a Perception component by which it retrieves information from its environment; this can be visual, audio, or another form of sensory input. The Learning component is responsible for learning from the data captured by the Perception component. In the complete cycle, the main components are Knowledge Representation and Reasoning; these two are what let a machine show human-like intelligence. They are independent of each other but also coupled together. Planning and Execution depend on the analysis produced by knowledge representation and reasoning.
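One way to picture this cycle is as a loop that wires the components together. The sketch below uses trivial placeholder functions for each component, so the names and behavior are illustrative only, not an actual API.

# Placeholder components for the perceive -> learn -> reason -> plan -> execute cycle.
def perceive(env):          return env["percept"]
def learn(kb, percept):     return kb + [percept]
def reason(kb):             return {"latest": kb[-1], "count": len(kb)}
def make_plan(beliefs):     return ["act-on-" + str(beliefs["latest"])]
def execute(env, plan):     env["log"] = env.get("log", []) + plan

def agent_cycle(env, kb):
    """One pass through the cycle described above."""
    percept = perceive(env)    # Perception: retrieve information from the environment
    kb = learn(kb, percept)    # Learning from the captured data
    beliefs = reason(kb)       # Knowledge representation and reasoning
    plan = make_plan(beliefs)  # Planning uses the result of that analysis
    execute(env, plan)         # Execution acts back on the environment
    return kb

env = {"percept": "dirt-detected"}
kb = agent_cycle(env, [])
print(env["log"])   # ['act-on-dirt-detected']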
A tautology is a statement that is true in every row of the truth table. It is a contradiction if it is false in every row. The statement is a contingency if it is neither a tautology nor a contradiction, that is, if there is at least one row where it is true and at least one row where it is false.
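This classification can be checked mechanically by enumerating the truth table. The sketch below assumes the statement is given as a Python function of its propositional variables, which is an encoding chosen here for illustration.

from itertools import product

def classify(formula, num_vars):
    """Return 'tautology', 'contradiction', or 'contingency' by checking every row."""
    rows = [formula(*values) for values in product([True, False], repeat=num_vars)]
    if all(rows):
        return "tautology"
    if not any(rows):
        return "contradiction"
    return "contingency"

print(classify(lambda p: p or not p, 1))    # tautology
print(classify(lambda p: p and not p, 1))   # contradiction
print(classify(lambda p, q: p and q, 2))    # contingency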
