PPTs - Set Theory & Probability (L-1 To L-6)
ASSESSMENT CRITERIA
Quiz
Mid Term Examination I & II
End Term Examination
[Mapping of course outcomes (CO1, …) with program outcomes: table not recovered from the slides]
L-T-P-C: 2-1-0-3
INTRODUCTION
This course is offered by the Dept. of Mathematics & Statistics as a regular course to acquaint students with probability and statistics at an early stage. Probability and statistics are an important foundation for computer science fields such as machine learning, artificial intelligence, computer graphics, randomized algorithms, image processing, and scientific simulations. In this course, students will expand their knowledge of probabilistic methods and apply them to diverse computational problems. The first part of the course offers in-depth knowledge of probability theory (random events, probability, characteristics of random variables, probability distributions and moment generating functions), which is necessary for the simulation of random processes. In the second part, sampling theory is discussed. Each concept is explained through various examples and application-oriented problems.
Course Outcomes: At the end of the course, students will be able to
[2201.1] Apply the concepts of probability and related theorems in solving various real-world problems.
[2201.2] Understand the key concept of a random variable and its probability distributions, including mean, expectation, variance and moments.
[2201.3] Implement the variation and the relation between two random variables by using the concept of correlation.
[2201.4] Comprehend the concept of a random sample and its sampling distribution, which will enhance logical and analytical skills.
[2201.5] Apply statistics for testing the significance of given large and small sample data by using the t-test, F-test and Chi-square test.
ASSESSMENT PLAN
Total: 100
SYLLABUS
Basic set theory, axioms of probability, sample space, conditional probability, total probability theorem, Bayes' theorem. One-dimensional and two-dimensional random variables, mean and variance, properties, Chebyshev's inequality, correlation coefficient. Distributions: Binomial, Poisson, Normal and Chi-square. Functions of random variables: one-dimensional and two-dimensional, F and t distributions, moment generating functions. Sampling theory, central limit theorem, point estimation, MLE, interval estimation. Tests of hypotheses: significance level, certain best tests; Chi-square test.
REFERENCE BOOKS:
2. Miller, Freund and Johnson, Probability and Statistics for Engineers, (8e), Prentice
Hall of India, 2011.
Repeated elements in sets are not allowed. In other words, {1, 2, 3, 3} = {1, 2, 3}. If we want to allow
repeats, we can use a related object called a multiset.
We will usually denote a set by a capital letter.
In set-builder notation, a comma between conditions is read as “and”.
There are often multiple ways to describe a set, e.g.,
{x ∈ R | x² − 5x + 6 = 0} = {x | x ∈ R, x² − 5x = −6} = {2, 3}.
A set is finite if it has a finite number of elements. Otherwise, it is an infinite set.
The number of elements in a set A is called its cardinality, denoted |A|. If A is infinite, we may write
|A| = ∞.
Definition
Let A and B be sets. We say that A is a subset of B if (and only if) every element of A is an element of B. We write
this as A ⊆ B, or B ⊇ A.
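As a quick aside (my illustration, not from the slides), these definitions map directly onto Python's built-in set type; a minimal sketch:

```python
# Python sets mirror the definitions above.
A = {1, 2, 3, 3}                                    # duplicates collapse: A == {1, 2, 3}
B = {x for x in range(10) if x**2 - 5*x + 6 == 0}   # set-builder style: {2, 3}

print(A)          # {1, 2, 3}
print(len(A))     # cardinality |A| = 3
print(B)          # {2, 3}
print(B <= A)     # True: B ⊆ A
```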
SET COMPLEMENTS
Frequently, we will need to establish the set of all elements U under consideration, which we
call the universal set.
Definition
The complement of a set A is the set of all elements in U that are not in A:
Aᶜ = {x ∈ U | x ∉ A}.
RELATIVE COMPLEMENTS
Definition
For sets A and B, the complement of A relative to B is the set of elements that are in B but
not A:
B − A = {x | x ∈ B and x ∉ A}.
The symmetric difference of A and B is the set of elements that are in one of these sets, but
not the other:
A ⊕ B = (A − B) ∪ (B − A).
The complement of A relative to B can also be denoted B \ A.
Exercises
Compute A − B, B − A, and A ⊕ B in the following cases:
1. A = {1, 3, 8} and B = {2, 4, 8}
2. Any set A, and B = ∅.
3. A = R and B = Q.
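For Exercise 1 the answers can be checked mechanically; a small Python sketch (finite case only — cases 2 and 3 involve ∅, R and Q and are best reasoned by hand):

```python
A = {1, 3, 8}
B = {2, 4, 8}

print(A - B)   # {1, 3}        relative complement B − A style: elements of A not in B
print(B - A)   # {2, 4}        elements of B not in A
print(A ^ B)   # {1, 2, 3, 4}  symmetric difference, (A − B) ∪ (B − A)
```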
Venn diagrams
A useful way to visualize a small number of sets and their intersections, unions, and relative complements is with a Venn diagram.
Social media has caused these to become mainstream, though they are often used
incorrectly.
CARTESIAN PRODUCTS
Definition
The Cartesian product of sets A and B is the set of ordered pairs:
A × B = {(a, b) | a ∈ A, b ∈ B}.
Examples
Let A = {1, 2, 3} and B = {4, 5}. Then
A × B = {(1, 4), (1, 5), (2, 4), (2, 5), (3, 4), (3, 5)},
B × A = {(4, 1), (4, 2), (4, 3), (5, 1), (5, 2), (5, 3)},
A × A = {(1, 1), (1, 2), (1, 3), (2, 1), (2, 2), (2, 3), (3, 1), (3, 2), (3, 3)}.
Similarly, we can define the Cartesian product of three (or more) sets. For example,
A × B × C = {(a, b, c) | a ∈ A, b ∈ B, c ∈ C}.
It is common to use exponents if the sets are the same, e.g.,
A2 = A × A, A3 = A × A × A, . . .
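A short illustration (mine, not the slides') of these products using Python's itertools.product:

```python
from itertools import product

A = {1, 2, 3}
B = {4, 5}

AxB = set(product(A, B))          # {(1, 4), (1, 5), ..., (3, 5)}
BxA = set(product(B, A))          # pairs are ordered, so A × B ≠ B × A
A2  = set(product(A, repeat=2))   # A² = A × A

print(len(AxB), len(BxA), len(A2))   # 6 6 9, i.e. |A × B| = |A| · |B|
```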
POWER SETS
Definition
The power set of A is the set of all subsets of A, denoted P(A). (Including both ∅ and A.)
Examples
1. P(∅) = {∅}
2. P({1}) = {∅, {1}}
3. P({1, 2}) = {∅,{1}, {2}, {1, 2}}.
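A sketch of a power-set construction in Python (the helper name power_set is my own):

```python
from itertools import chain, combinations

def power_set(s):
    """Return P(s) as a set of frozensets (frozensets are hashable,
    so the subsets can themselves be elements of a set)."""
    items = list(s)
    all_subsets = chain.from_iterable(
        combinations(items, r) for r in range(len(items) + 1))
    return {frozenset(c) for c in all_subsets}

print(power_set(set()))          # {frozenset()}, i.e. P(∅) = {∅}
print(power_set({1}))            # {frozenset(), frozenset({1})}
print(len(power_set({1, 2})))    # 4, matching |P(A)| = 2^|A|
```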
SUMMATION NOTATION
Addition is a binary operation that is associative, which means that parentheses are
permitted anywhere but required nowhere.
In this notation, the values below and above the operator symbol are the initial index and the terminal index, respectively.
(a) A₁ ∩ A₂ ∩ ··· ∩ Aₙ = ⋂ᵢ₌₁ⁿ Aᵢ
(b) A₁ ∪ A₂ ∪ ··· ∪ Aₙ = ⋃ᵢ₌₁ⁿ Aᵢ
(c) A₁ × A₂ × ··· × Aₙ = ⨉ᵢ₌₁ⁿ Aᵢ
(d) A₁ ⊕ A₂ ⊕ ··· ⊕ Aₙ = ⊕ᵢ₌₁ⁿ Aᵢ
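Because each operation is associative, the n-fold forms above can be computed as a left fold; a minimal Python sketch:

```python
from functools import reduce

sets = [{1, 2, 3, 4}, {2, 3, 4, 5}, {3, 4, 5, 6}]   # A1, A2, A3

# n-fold versions of (a), (b) and (d) via a left fold, valid by associativity.
print(reduce(set.intersection, sets))          # {3, 4}
print(reduce(set.union, sets))                 # {1, 2, 3, 4, 5, 6}
print(reduce(set.symmetric_difference, sets))  # {1, 3, 4, 6}: elements in an odd number of the Ai
```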
DISTRIBUTIVE LAWS
See if you can find a general formula for the following two expressions by looking at the cases where n = 2 and drawing a Venn diagram:
A ∩ (B₁ ∪ B₂ ∪ ··· ∪ Bₙ) and A ∪ (B₁ ∩ B₂ ∩ ··· ∩ Bₙ).
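As a hint for the exercise, here is a randomized spot-check (my own illustration) of the n = 2 case; the general formula is still left to you:

```python
import random

# Spot-check the n = 2 distributive laws on random small sets:
#   A ∩ (B1 ∪ B2) = (A ∩ B1) ∪ (A ∩ B2)
#   A ∪ (B1 ∩ B2) = (A ∪ B1) ∩ (A ∪ B2)
for _ in range(1_000):
    A, B1, B2 = ({random.randrange(8) for _ in range(4)} for _ in range(3))
    assert A & (B1 | B2) == (A & B1) | (A & B2)
    assert A | (B1 & B2) == (A | B1) & (A | B2)
print("n = 2 distributive laws hold on all sampled cases")
```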
PROBABILITY THEORY
Basics
Probability theory deals with the study of random phenomena, which under
repeated experiments yield different outcomes that have certain
underlying patterns about them. The notion of an experiment assumes a
set of repeatable conditions that allow any number of identical repetitions.
When an experiment is performed under these conditions, certain
elementary events occur in different but completely uncertain ways. We
can assign a nonnegative number as the probability of an event in various ways:
Laplace's Classical Definition: The probability of an event A is defined a priori, without actual experimentation, as
P(A) = N_A / N,
where N is the total number of equally likely outcomes of the experiment and N_A is the number of those outcomes favorable to the event A.
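A tiny illustrative computation (my example, not the slides') of the classical definition for a fair die:

```python
from fractions import Fraction

# Classical (Laplace) probability: P(A) = N_A / N with all N outcomes equally likely.
outcomes = set(range(1, 7))                 # N = 6
A = {x for x in outcomes if x % 2 == 0}     # event "outcome is even", N_A = 3

print(Fraction(len(A), len(outcomes)))      # 1/2
```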
[Venn diagrams: the union A ∪ B, the intersection A ∩ B, and a partition A₁, A₂, …, Aₙ of the sample space]
De Morgan's Laws:
(A ∪ B)ᶜ = Aᶜ ∩ Bᶜ; (A ∩ B)ᶜ = Aᶜ ∪ Bᶜ (1-10)
Axioms of Probability (1-12): To each event A we assign a number P(A), its probability, satisfying
(i) P(A) ≥ 0,
(ii) P(Ω) = 1, where Ω is the entire sample space (the certain event),
(iii) P(A ∪ B) = P(A) + P(B) whenever A ∩ B = ∅.
(Note that (iii) states that if A and B are mutually exclusive (M.E.) events, the probability of their union is the sum of their probabilities.)
The following conclusions follow from these axioms:
a. Since A ∪ Aᶜ = Ω, we have, using (ii),
P(A ∪ Aᶜ) = P(Ω) = 1.
But A ∩ Aᶜ = ∅, and using (iii),
P(A ∪ Aᶜ) = P(A) + P(Aᶜ) = 1, or P(Aᶜ) = 1 − P(A). (1-13)
b. Similarly, for any A, A ∪ ∅ = A with A ∩ ∅ = ∅. Hence it follows that P(A ∪ ∅) = P(A) + P(∅). But A ∪ ∅ = A, and thus P(∅) = 0. (1-14)
c. Suppose A and B are not mutually exclusive (M.E.). How does one compute P(A ∪ B)?
To compute this probability, we re-express A ∪ B in terms of M.E. sets so that we can make use of the probability axioms. From the figure we have
A ∪ B = A ∪ (Aᶜ ∩ B), (1-15)
where A and Aᶜ ∩ B are clearly M.E. events. Thus, using axiom (1-12-iii),
P(A ∪ B) = P(A ∪ (Aᶜ ∩ B)) = P(A) + P(Aᶜ ∩ B).
To compute P(Aᶜ ∩ B), write
B = B ∩ Ω = B ∩ (A ∪ Aᶜ) (1-16)
= (B ∩ A) ∪ (B ∩ Aᶜ), (1-17)
where B ∩ A and B ∩ Aᶜ are M.E. events, so that P(B) = P(A ∩ B) + P(Aᶜ ∩ B), and therefore
P(Aᶜ ∩ B) = P(B) − P(A ∩ B). (1-19)
Combining these, P(A ∪ B) = P(A) + P(B) − P(A ∩ B).
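The resulting identity can be sanity-checked by enumeration; a minimal sketch on a fair die (my own example):

```python
from fractions import Fraction

# Check P(A ∪ B) = P(A) + P(B) − P(A ∩ B) by counting outcomes.
U = set(range(1, 7))
A = {x for x in U if x % 2 == 0}   # "outcome is even"
B = {x for x in U if x <= 3}       # "outcome is at most 3"

def P(E):
    return Fraction(len(E), len(U))

assert P(A | B) == P(A) + P(B) - P(A & B)
print(P(A | B))   # 5/6
```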
Among the N_A occurrences of A, only N_AB of them are also found among the N_B occurrences of B. Thus the ratio
N_AB / N_B = (N_AB / N) / (N_B / N) = P(AB) / P(B) (1-22)
is a measure of “the event A given that B has already occurred”. We
denote this conditional probability by
P(A|B) = Probability of “the event A given that B has occurred”.
We define
P(A | B) = P(AB) / P(B), provided P(B) > 0. (1-23)
We have
(i) P(A | B) = P(AB) / P(B) ≥ 0, since P(AB) ≥ 0 and P(B) > 0; (1-24)
(ii) P(Ω | B) = P(ΩB) / P(B) = P(B) / P(B) = 1, since Ω ∩ B = B. (1-25)
If A ⊂ B, then AB = A, and
P(A | B) = P(AB) / P(B) = P(A) / P(B) ≥ P(A). (1-41)
(In a dice experiment, A = {outcome is 2} and B = {outcome is even}, so that A ⊂ B. The statement that B has occurred (the outcome is even) makes the odds for “outcome is 2” greater than without that information.)
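The dice illustration can be computed directly from definition (1-23); a short sketch:

```python
from fractions import Fraction

U = set(range(1, 7))
A = {2}                                # "outcome is 2"
B = {x for x in U if x % 2 == 0}       # "outcome is even"; note A ⊂ B

def P(E):
    return Fraction(len(E), len(U))

print(P(A & B) / P(B))   # P(A|B) = 1/3, greater than P(A) = 1/6
```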
c. We can use the conditional probability to express the probability of a
complicated event in terms of “simpler” related events.
Total Probability Theorem
Let A₁, A₂, …, Aₙ be pairwise disjoint events whose union is Ω, i.e.,
Aᵢ ∩ Aⱼ = ∅ for i ≠ j, and ⋃ᵢ₌₁ⁿ Aᵢ = Ω. (1-29)
Then for any event B,
P(B) = P(B | A₁) P(A₁) + ··· + P(B | Aₙ) P(Aₙ).
Example: A box contains 6 white and 4 black balls. Two balls are removed at random without replacement. What is the probability that the first ball is white and the second is black? Let W₁ = “first ball removed is white” and B₂ = “second ball removed is black”. Then
P(W₁) = 6 / (6 + 4) = 6/10 = 3/5,
and
P(B₂ | W₁) = 4 / (5 + 4) = 4/9,
and hence
P(W₁B₂) = P(B₂ | W₁) P(W₁) = (4/9)(3/5) = 12/45 = 4/15 ≈ 0.27.
Are the events W₁ and B₂ independent? Our common sense says no. To verify this, we need to compute P(B₂). Of course, the fate of the second ball very much depends on that of the first ball. The first ball has two options: W₁ = “first ball is white” or B₁ = “first ball is black”. Note that W₁ ∩ B₁ = ∅ and W₁ ∪ B₁ = Ω, hence W₁ together with B₁ form a partition. Thus, by the total probability theorem,
P(B₂) = P(B₂ | W₁) P(W₁) + P(B₂ | B₁) P(B₁)
= (4 / (5 + 4)) (3/5) + (3 / (6 + 3)) (4/10) = (4/9)(3/5) + (1/3)(2/5) = 4/15 + 2/15 = 6/15 = 2/5,
and
P(B₂) P(W₁) = (2/5)(3/5) = 6/25 ≠ 4/15 = P(B₂W₁),
so W₁ and B₂ are dependent events, as expected.
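A Monte Carlo sanity check of this example (a sketch; the trial count is chosen arbitrarily):

```python
import random

# Urn example: 6 white and 4 black balls, two drawn without replacement.
# Exact values derived above: P(W1 ∩ B2) = 4/15 ≈ 0.267 and P(B2) = 2/5 = 0.4.
trials = 100_000
w1b2 = b2 = 0
for _ in range(trials):
    box = ["W"] * 6 + ["B"] * 4
    random.shuffle(box)
    first, second = box[0], box[1]
    b2 += (second == "B")
    w1b2 += (first == "W" and second == "B")

print(w1b2 / trials)   # ≈ 0.267
print(b2 / trials)     # ≈ 0.4
```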
Similarly, from (1-22),
P(B | A) = P(BA) / P(A) = P(AB) / P(A),
so that
P(A | B) = P(B | A) P(A) / P(B).
Combined with the total probability theorem for a partition A₁, …, Aₙ of Ω, this gives Bayes' theorem:
P(Aᵢ | B) = P(B | Aᵢ) P(Aᵢ) / Σⱼ₌₁ⁿ P(B | Aⱼ) P(Aⱼ).
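A minimal sketch of the total probability and Bayes computations, reusing the numbers from the urn example (A₁ = W₁, A₂ = B₁, B = B₂):

```python
from fractions import Fraction

# Total probability and Bayes' theorem over a partition A1, ..., An.
priors      = [Fraction(3, 5), Fraction(2, 5)]   # P(Ai)
likelihoods = [Fraction(4, 9), Fraction(1, 3)]   # P(B | Ai)

P_B = sum(p * l for p, l in zip(priors, likelihoods))            # total probability
posteriors = [p * l / P_B for p, l in zip(priors, likelihoods)]  # Bayes' theorem

print(P_B)         # 2/5, matching the example above
print(posteriors)  # [Fraction(2, 3), Fraction(1, 3)] = P(Ai | B)
```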