
Chapter 3: Discrete distribution

Shilpa G.
For a given sample space S of some experiment, a random variable
(rv) is any rule that associates a real number with each outcome in
S.

In mathematical language, a random variable is a function whose


domain is the sample space and whose range is the set of real
numbers.
Example
• Experiment: observe the total of the numbers appearing on top after throwing two dice.
• Experiment: see whether a root of a given polynomial is real.
• Experiment: see at which trial success occurs.
• Experiment: choose a number in [0, 3] at random.
Random variable

• Discrete: the range is finite or countably infinite.
• Continuous: the range is a union of intervals.

Definition
A random variable is discrete if it can assume at most a finite or a
countably infinite number of possible values.
Convention: We use capital letters, such as X , Y , etc., for random variables. Observed values are denoted by small letters, like x, y , etc.
Example
• Y = number of heads if a coin is tossed 3 times.
• Remember: ‘Time’ (in its basic form) is not a discrete random variable.
Definition
Let X be a discrete random variable. The function f given by

f (x) = P[X = x]

for x real is called the (probability) density function for X .


Remark: Some authors use the letter ‘p’ instead of ‘f’ to denote the PDF.
Also, some authors call it the (probability) mass function.

Necessary and sufficient conditions for a function to be a discrete density:

f(x) ≥ 0 for all x, and ∑_{all x} f(x) = 1.
Example
1. Tossing a coin 3 times. X is total number of heads appear.
(Remember: when no information is provided, we use classical
definition of probability.)
2. f(y) = (1/2)^y for y = 1, 2, 3, . . ., and f(y) = 0 otherwise.

Check whether each of the following is a PDF.


1. f(x) = (x − 2)/2 for x = 1, 2, 3, 4.
2. f(x) = (x − 3)²/5 for x = 3, 4, 5.
3. f(x) = x²/25 for x = 0, 1, 2, 3, 4.
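The three checks above can be sketched in code. A minimal Python snippet (not from the slides) using exact fractions so no rounding intrudes:

```python
# Check the two conditions for a discrete density: f(x) >= 0 on the support,
# and the values sum to 1.
from fractions import Fraction

def is_pdf(values):
    """values: list of f(x) over the support."""
    return all(v >= 0 for v in values) and sum(values) == 1

f1 = [Fraction(x - 2, 2) for x in range(1, 5)]         # f(x) = (x-2)/2
f2 = [Fraction((x - 3) ** 2, 5) for x in range(3, 6)]  # f(x) = (x-3)^2/5
f3 = [Fraction(x ** 2, 25) for x in range(0, 5)]       # f(x) = x^2/25

print(is_pdf(f1))  # False: f(1) = -1/2 < 0, even though the values sum to 1
print(is_pdf(f2))  # True: values 0, 1/5, 4/5 are nonnegative and sum to 1
print(is_pdf(f3))  # False: the values sum to 30/25, not 1
```

Note that exercise 1 fails on nonnegativity alone: its values do sum to 1, which is why both conditions must be checked.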
Definition
Let X be a discrete random variable with density f . The
cumulative distribution function for X , denoted by F , is defined
by
F (x) = P[X ≤ x] for x real.
Definition
Let X be a discrete random variable with density f . Let H(X ) be a
random variable. The expected value of H(X ), denoted by
E [H(X )], is given by

E[H(X)] := ∑_{all x} H(x) f(x)

provided ∑_{all x} |H(x)| f(x) is finite.

E [X ] is called the mean of the random variable X . It is denoted by µ
or µ_X . It is also called the expected value, (weighted) average value,
or mean value. It is also referred to as a ‘location’ parameter.
Remark
If X is a discrete random variable then so is H(X ), where H is a
function of X . So, while calculating E [H(X )], either we use the formula
∑ H(x)f (x) directly or we consider a new discrete random variable
Y := H(X ), write its density f_Y , and calculate E [Y ].

Example

X -3 -1 0 1 2 4 5
f(X) .1 .2 .1 .3 .1 .1 .1
with H(X ) = X (X + 2).
Take Y = H(X ). Then, P[Y = y ] = P[X ∈ {x : H(x) = y }].
Hence, we get

Y -1 0 3 8 24 35
f(Y) .2 .1 .4 .1 .1 .1
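Both routes to E[H(X)] from the table can be checked numerically; a short sketch (not part of the slides):

```python
from collections import defaultdict

# Density of X from the table above.
f = {-3: 0.1, -1: 0.2, 0: 0.1, 1: 0.3, 2: 0.1, 4: 0.1, 5: 0.1}

def H(x):
    return x * (x + 2)

# Way 1: sum H(x) f(x) directly over the support of X.
e1 = sum(H(x) * p for x, p in f.items())

# Way 2: build the density of Y = H(X), merging x-values with equal H(x)
# (here x = -3 and x = 1 both map to y = 3), then take sum of y f_Y(y).
fY = defaultdict(float)
for x, p in f.items():
    fY[H(x)] += p
e2 = sum(y * p for y, p in fY.items())

print(abs(e1 - e2) < 1e-9)  # True; both give E[H(X)] = 7.7
```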
Theorem
Let X and Y be random variables and let c be any real number.
Then,
1. E [c] = c.
2. E [cX ] = cE [X ].
3. E [X + Y ] = E [X ] + E [Y ].
Definition
We say two variables X and Y are independent iff

P[X = x, Y = y ] = P[X = x]P[Y = y ]

for all observed values x of X and y of Y .


Remark: We will see the precise definition of independence when we
study the joint PDF of two variables.
Fact
(i) Let X and Y be independent random variables. Then, f (X )
and g (Y ) are independent for any choice of continuous functions f
and g of X and Y respectively.
(ii) Let X and Y be independent random variables. Then,

E [XY ] = E [X ]E [Y ].
Definition
Let X be a random variable. The k-th ordinary moment for X is defined
as E[X^k], where k is a natural number. The k-th moment around µ_X for
X is defined as E[(X − µ_X)^k], where k is a natural number.

Definition
Let X be a random variable with mean µ. The variance of X , denoted by
σ², is given by
Var X = σ² := E[(X − µ)²].

Theorem
σ² = E[X²] − (E[X])².

Definition
Let X be a random variable with variance σ². The standard deviation of
X , denoted by σ, is given by
σ := √(Var X) = √(σ²).
Theorem
Let X and Y be random variables and c be any real number. Then,
(i) Var c = 0.
(ii) Var cX = c² Var X .
(iii) If X and Y are independent, then
Var (X + Y ) = Var X + Var Y .
Moment generating function
• Generating functions bridge between sequences of numbers
and the world of calculus.
• In probability, they are useful for studying both discrete and
continuous distributions.
• Idea: starting with a sequence of numbers, create a continuous
function (the generating function) that encodes the sequence,
then apply the tools of calculus at our disposal to manipulate
the generating function.

Definition
Let X be a random variable with density f . The moment
generating function for X , denoted by m_X(t), is given by

m_X(t) := E[e^{tX}]

provided this expectation is finite for all real numbers t in some
open interval containing zero.
Moment generating function

• What is the interpretation of t?


• Note that m_X(0) = 1 for any valid MGF. Whenever you
compute an MGF, plug in 0 and see if you get 1, as a quick
check!
• A moment generating function, as its name suggests, is a
generating function that encodes the moments of a
distribution.
Theorem
Let m_X(t) be the moment generating function for a random
variable X . Then

(d^k/dt^k) m_X(t) |_{t=0} = E[X^k].
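The theorem can be checked numerically on a toy example: approximate m'(0) by a central difference and compare against E[X]. A sketch (the density below is an assumed example, not from the slides):

```python
import math

# An assumed toy density, used only for illustration.
f = {0: 0.3, 1: 0.5, 2: 0.2}

def m(t):
    """MGF of the discrete rv with density f: E[e^(tX)]."""
    return sum(math.exp(t * x) * p for x, p in f.items())

mean = sum(x * p for x, p in f.items())   # E[X] = 0.9
h = 1e-6
m_prime_0 = (m(h) - m(-h)) / (2 * h)      # central-difference estimate of m'(0)

print(abs(m(0.0) - 1.0) < 1e-12)   # True: the quick check m_X(0) = 1
print(abs(m_prime_0 - mean) < 1e-6)  # True: m'(0) = E[X]
```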
Uniform random variable

• We have seen the basic building blocks of discrete


distributions and we now study particular models that
statisticians often encounter in the field.
• The most fundamental of all is the discrete uniform
distribution.
• A random variable X with the discrete uniform distribution on
the integers 1, 2, · · · , n has PMF

f (x) = P(X = x) = 1/n, x = 1, 2, ..., n.

• We write X ∼ Unif (n).

Mean = E (X ) = (n + 1)/2.
Uniform random variable

• We have

E(X²) = ∑_{x=1}^{n} x² f(x) = (n + 1)(2n + 1)/6

and therefore

Variance = V (X ) = σ² = E(X²) − (E(X))² = (n² − 1)/12.

• MGF is

m_X(t) = (e^t − e^{(n+1)t}) / (n(1 − e^t)), t ≠ 0,

and
m_X(t) = 1, t = 0.
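The mean and variance formulas are easy to confirm for a concrete n by summing over the support directly; a quick check (n = 10 is an arbitrary choice):

```python
# Discrete uniform on 1..n: verify mean (n+1)/2, E(X^2) = (n+1)(2n+1)/6,
# and variance (n^2 - 1)/12 by direct summation with f(x) = 1/n.
n = 10
support = range(1, n + 1)

mean = sum(x / n for x in support)
ex2 = sum(x * x / n for x in support)
var = ex2 - mean ** 2

print(abs(mean - (n + 1) / 2) < 1e-12)               # True
print(abs(ex2 - (n + 1) * (2 * n + 1) / 6) < 1e-12)  # True
print(abs(var - (n * n - 1) / 12) < 1e-9)            # True
```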
Remark

Some authors use the following definition for the discrete uniform
distribution.
A random variable is said to have a discrete uniform distribution if
f(x) = 1/n for x = x1 , x2 , . . . , xn with n ∈ N fixed.
We write X ∼ U(n) or X ∼ U({x1 , x2 , . . . , xn }).

HW: Find the mean, variance, and MGF of X .


Bernoulli Trials

• A Bernoulli trial is a random experiment in which there are
only two possible outcomes: success and failure.

Examples
1. Tossing a coin and considering heads as success and tails as
failure.
2. Checking items from a production line: success = not defective,
failure = defective.
3. Phoning a call centre: success = operator free; failure = no
operator free.
Bernoulli Random Variables

• A Bernoulli random variable X takes the values 0 and 1 and

P(X = 1) = p and P(X = 0) = 1 − p.

• E (X ) = p, V (X ) = p(1 − p), m_X(t) = (1 − p) + p e^t.


• The number p is called the parameter of the distribution.
We write
X ∼ Bern(p).
Geometric random variable
• Suppose that we perform an experiment E and are concerned only
about the occurrence or nonoccurrence of some event A.
• We perform E repeatedly, that the repetitions are independent, and
that on each repetition P(A) = p and P(Ac ) = 1 − p = q remain
the same.
• Define the random variable X as the number of repetitions required
up to and including the first occurrence of A.
• Thus X assumes the possible values 1, 2, · · · . Since X = x if and
only if the first (x − 1) repetitions result in A^c while the x-th
repetition results in A, we have
P(X = x) = q^{x−1} p, x = 1, 2, · · ·

• A random variable with the above probability distribution is said to


have a geometric distribution.
• We will describe X in this case by writing X ∼ G (p).
Example

Problem (1)
In Zuarinagar locality the probability that a thunderstorm will
occur on any given day during the summer (say April and May)
equals 0.1 . Assuming independence from day to day, what is the
probability that the first thunderstorm of the summer season
occurs on May 3 ?
Ans. Let X be the number of days (starting with April 1 ) until the
first thunderstorm.
We require P(X = 33), which equals (0.9)^32 (0.1) ≈ 0.003.
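The arithmetic is one line of code; a quick check of the answer above:

```python
# Geometric pmf: P(X = x) = q^(x-1) p. May 3 is day 33 counting from April 1
# (30 days of April plus 3 days of May), with p = 0.1.
p = 0.1
x = 33
prob = (1 - p) ** (x - 1) * p

print(round(prob, 3))  # 0.003
```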
Example

Problem (2)
If the probability that a certain test yields a ”positive” reaction
equals 0.4, what is the probability that fewer than 5 ”negative”
reactions occur before the first positive one?
Ans. Let X be the trial number of the first positive reaction, so
that fewer than 5 negative reactions occur exactly when X ≤ 5.
We have
P(X = x) = (0.6)^{x−1}(0.4), x = 1, 2, · · ·
Hence
P(X ≤ 5) = ∑_{x=1}^{5} (0.6)^{x−1}(0.4) = 0.92.
CDF of geometric random variable

Lemma
Let X be a geometric random variable. Let p be probability of
success and q = 1 − p. Then, the cumulative distribution function
of X is given by

F (x) = 1 − q^⌊x⌋.
Generating function, Mean, Variance

Theorem
Let X be a geometric random variable with parameter p and
q = 1 − p. Then

m_X(t) = p e^t / (1 − q e^t), t < − ln(q).

E [X ] = 1/p and Var X = q/p².


Problem (3)
Suppose that the cost of performing an experiment is $1000. If the
experiment fails, an additional cost of $300 occurs because of
certain changes that have to be made before the next trial is
attempted. If the probability of success on any given trial is 0.2, if
the individual trials are independent, and if the experiments are
continued until the first successful result is achieved, what is the
expected cost of the entire procedure?
Ans. If C is the cost, and X is the number of trials required to
achieve success,

C = 1000X + 300(X − 1) = 1300X − 300.

Hence

E (C ) = 1300E (X ) − 300 = 1300(1/0.2) − 300 = $6200.
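Since the cost is linear in X, linearity of expectation does all the work; a quick check of the computation:

```python
# X ~ Geom(0.2) counts trials up to the first success, so E[X] = 1/p.
# Cost C = 1000 X + 300 (X - 1) = 1300 X - 300 is linear in X.
p = 0.2
expected_trials = 1 / p                       # E[X] = 5
expected_cost = 1300 * expected_trials - 300  # E[C] = 1300 E[X] - 300

print(expected_cost)  # 6200.0
```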


Problem (4)
Let X1 , X2 ∼ Geom(p) and let X1 , X2 be independent. Calculate
PDF of Y := X1 + X2 . Can you do it for n independent variables
with identical geometric distribution?

Problem (5)
Check memoryless property of geometric distribution. That is,
check that if X ∼ Geom(p), then

P[X > m + n | X > m] = P[X > n].
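A numerical sketch of Problem 5 (the values of p, m, n are arbitrary choices): for the geometric distribution P[X > k] = q^k, so the conditional tail factors exactly.

```python
# Memorylessness: P[X > m+n | X > m] = q^(m+n) / q^m = q^n = P[X > n].
p = 0.3
q = 1 - p
m, n = 4, 6

def tail(k):
    """P[X > k] for X ~ Geom(p): the first k trials are all failures."""
    return q ** k

lhs = tail(m + n) / tail(m)  # conditional probability P[X > m+n | X > m]
rhs = tail(n)

print(abs(lhs - rhs) < 1e-12)  # True
```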


Binomial Experiments

• The experiment consists of n repeated Bernoulli trials; each
trial has only two possible outcomes, labelled success and
failure.
• The trials are identical and independent, and the probability p
of success remains the same from trial to trial.
• The random variable X denotes the number of successes
obtained in the n trials.
Examples

• Consider the experiment where three balls are drawn without


replacement from a box containing 20 red and 40 blue balls,
and the number of red balls drawn is recorded. Is this a
binomial experiment?
Ans. No! The key here is the lack of independence: since the
balls are drawn without replacement, the first draw affects the
probabilities on later draws.
• A fair die is rolled ten times, and the number of 6’s is
recorded. Is this a binomial experiment?
Ans. Yes! There is a fixed number of trials (ten rolls), each
roll is independent of the others, there are only two outcomes
(either it’s a 6 or it isn’t), and the probability of rolling a 6 is
constant.
Binomial Distribution
Transition from Geometric to Binomial
• If a family decides to have children until they have their first
girl and then stop, the total number of children has a geometric
distribution under this chapter's convention, which counts trials
up to and including the first success. Counting only the number
of boys before the first girl gives the shifted variant, one less
than the geometric count.
• Suppose the family decides to have n children. The number of
girls (successes) in the family has a binomial distribution.

Definition
Suppose that n independent Bernoulli trials are performed, each
with the same success probability p. Let X be the number of
successes. The distribution of X is called the Binomial distribution
with parameters n and p. We write X ∼ Bin(n, p).
• The Bernoulli distribution is the special case n = 1 of the Binomial distribution.
Theorem
Let a random variable X have a binomial distribution with
parameters n and p. Then its probability density function is given
by

f(x) = C(n, x) p^x (1 − p)^{n−x}, x = 0, 1, . . . , n,

where 0 < p < 1 and n is a positive integer.
We denote f (x) = b(x, n, p).
Problem (5)
Let’s consider the experiment where we take a multiple-choice quiz
of four questions with four choices each, and the topic is
something we have absolutely no knowledge of, say- Multivariable
Calculus. If we let X = the number of correct answers, then X is a
binomial random variable. What is the probability of getting
exactly 3 questions correct?
Ans. n = 4 and p = 0.25. We want P(X = 3) = C(4, 3)(0.25)^3(0.75)^1 = 0.0469.
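The binomial pmf is a one-liner with `math.comb`; a quick check of the answer:

```python
# P(X = x) for X ~ Bin(n, p): C(n, x) p^x (1-p)^(n-x).
from math import comb

def binom_pmf(x, n, p):
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

# Four questions, four choices each, pure guessing: n = 4, p = 0.25.
print(round(binom_pmf(3, 4, 0.25), 4))  # 0.0469
```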
Example

Problem (6)
A basketball player traditionally makes 85% of her free throws.
Suppose she shoots 10 baskets and counts the number she makes.
What is the probability that she makes less than 8 baskets?
Ans. If X = the number of made baskets, it’s reasonable to say the
distribution is binomial.
In this example, n=10 and p=0.85. We want P(X < 8).
P(X < 8) = P(X = 0) + P(X = 1) + ... + P(X = 7).
The probability of making less than 8 baskets is about 0.1798.
Example
Problem (7)
A quality control engineer is in charge of testing whether or not
90% of the DVD players produced by his company conform to
specifications. To do this, the engineer randomly selects a batch of
12 DVD players from each day’s production. The day’s production
is acceptable provided no more than 1 DVD player fails to meet
specifications. Otherwise, the entire day’s production has to be
tested.
(i) What is the probability that the engineer incorrectly passes a
day’s production as acceptable if only 80% of the day’s DVD
players actually conform to specification?
(ii) What is the probability that the engineer unnecessarily requires
the entire day’s production to be tested if in fact 90% of the DVD
players conform to specifications?
Answer: Let X denote the number of DVD players in the sample
that fail to meet specifications.
Example

(i) We want P(X ≤ 1) with binomial parameters n = 12, p = 0.2.

P(X ≤ 1) = P(X = 0) + P(X = 1)
= C(12, 0)(0.2)^0(0.8)^12 + C(12, 1)(0.2)^1(0.8)^11 = 0.069 + 0.206 = 0.275.
(ii) We now want P(X > 1) with parameters n = 12, p = 0.1.

P(X ≤ 1) = P(X = 0) + P(X = 1)


   
12 0 12 12
= (0.1) (0.9) + (0.1)1 (0.9)11 = 0.659
0 1
So P(X > 1) = 0.341.
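Both parts reduce to the same cumulative sum with different p; a quick check:

```python
# (i) P(X <= 1) for n = 12, p = 0.2; (ii) P(X > 1) for n = 12, p = 0.1.
from math import comb

def binom_pmf(x, n, p):
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

p_i = binom_pmf(0, 12, 0.2) + binom_pmf(1, 12, 0.2)
p_ii = 1 - (binom_pmf(0, 12, 0.1) + binom_pmf(1, 12, 0.1))

print(round(p_i, 3))   # 0.275
print(round(p_ii, 3))  # 0.341
```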
Binomial Distribution-Mean and Variance

• Direct calculation
• Using properties of MGF (and Bernoulli variables)

Problem
Let X ∼ Bin(n, p) and Y ∼ Bin(m, p) be two independent
variables. Prove that X + Y is a binomial variable.
There are two ways to solve it.
Way I: Direct checking.
Way II: Using MGF (which we would discuss now in detail).
Binomial Distribution-Moment Generating Function

Let X be a binomial variable with parameters n and p.


The moment generating function for X is given by

M_X(t) = ∑_{x=0}^{n} e^{tx} C(n, x) p^x (1 − p)^{n−x}
= ∑_{x=0}^{n} C(n, x) (p e^t)^x (1 − p)^{n−x},

and hence
M_X(t) = (1 − p + p e^t)^n.
Theorem
Let X be a random variable with moment generating function m_X(t). Let
Y = α + βX with α, β ∈ R. Then, the moment generating function of Y
is
m_Y(t) = e^{αt} m_X(βt).

Theorem
Let X1 and X2 be independent random variables with moment generating
functions m_X1(t) and m_X2(t) respectively. Let Y = X1 + X2 . The
moment generating function of Y is

m_Y(t) = m_X1(t) m_X2(t).


Theorem
Let X and Y be random variables with moment generating
functions mX (t) and mY (t) respectively. If mX (t) = mY (t) for all t
in some open interval about 0, then X and Y have the same
distribution.
Coming back to the problem....

We are given
X ∼ Bin(n, p), Y ∼ Bin(m, p)
Hence,

m_X(t) = (q + p e^t)^n and m_Y(t) = (q + p e^t)^m,

where q = 1 − p. Since X and Y are independent,

m_{X+Y}(t) = m_X(t) m_Y(t) = (q + p e^t)^{m+n}.

We know that if Z ∼ Bin(m + n, p) then m_Z(t) = (q + p e^t)^{m+n}.

Hence, by the uniqueness of MGFs, X + Y has the same distribution as Z. That is,

X + Y ∼ Bin(m + n, p).
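The factoring of the MGF can be illustrated numerically: at a handful of t values, m_X(t) m_Y(t) agrees with the Bin(m + n, p) MGF (the parameters below are arbitrary choices):

```python
# MGF of Bin(n, p) is (1 - p + p e^t)^n; check the product identity at a few t.
import math

def binom_mgf(t, n, p):
    return (1 - p + p * math.exp(t)) ** n

n, m, p = 5, 8, 0.3
ok = all(
    abs(binom_mgf(t, n, p) * binom_mgf(t, m, p) - binom_mgf(t, n + m, p)) < 1e-9
    for t in (-1.0, -0.5, 0.0, 0.5, 1.0)
)
print(ok)  # True
```

Agreement at a few points is only an illustration, of course; the theorem needs equality on an open interval about 0.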
Bernoulli to Binomial

• Any random variable X with a binomial distribution with
parameters n and p is a sum of n independent Bernoulli
random variables, each with success probability p.

X = X1 + X2 + · · · + Xn .

• The mean and variance of each Xi can easily be calculated as:

E (Xi ) = p, V (Xi ) = p(1 − p).

• Hence the mean and variance of X are given by (remember


the Xi are independent)

E (X ) = np, V (X ) = np(1 − p).


Limitation...

Using the moment generating function is a very powerful tool,
but unfortunately the MGF does not always exist, so we cannot
use it every time, even when dealing with independent variables.
Example

Problem (8)
Bits are sent over a communications channel in packets of 12. If
the probability of a bit being corrupted over this channel is 0.1 and
such errors are independent, what is the probability that no more
than 2 bits in a packet are corrupted? If 6 packets are sent over
the channel, what is the probability that at least one packet will
contain 3 or more corrupted bits? Let X denote the number of
packets containing 3 or more corrupted bits. What is the
probability that X will exceed its mean by more than 2 standard
deviations?
Ans. Let Y denote the number of corrupted bits in a packet; Y ∼ Bin(12, 0.1).
For the first question, we want

P(Y ≤ 2) = P(Y = 0) + P(Y = 1) + P(Y = 2).


But
P(Y = 0) = C(12, 0)(0.1)^0(0.9)^12 = 0.282
P(Y = 1) = C(12, 1)(0.1)^1(0.9)^11 = 0.377
P(Y = 2) = C(12, 2)(0.1)^2(0.9)^10 = 0.23.
Therefore
P(Y ≤ 2) = 0.282 + 0.377 + 0.23 = 0.889.
The probability of a packet containing 3 or more corrupted bits is
1 − 0.889 = 0.111.
Let X be the number of packets containing 3 or more corrupted
bits. X can be modelled with a binomial distribution with
parameters n = 6, p = 0.111. The probability that at least one
packet will contain 3 or more corrupted bits is:

1 − P(X = 0) = 1 − C(6, 0)(0.111)^0(0.889)^6 = 1 − 0.494 = 0.506.
Example

The mean of X is µ = 6(0.111) = 0.666 and its standard deviation
is σ = √(6(0.111)(0.889)) = 0.77. So the probability that X
exceeds its mean by more than 2 standard deviations is
P(X − µ > 2σ) = P(X > 2.2).
As X is discrete, this is equal to P(X ≥ 3).

P(X = 0) = C(6, 0)(0.111)^0(0.889)^6 = 0.494.
P(X = 1) = C(6, 1)(0.111)^1(0.889)^5 = 0.37.
P(X = 2) = C(6, 2)(0.111)^2(0.889)^4 = 0.115.
So
P(X ≥ 3) = 1 − (0.494 + 0.37 + 0.115) = 0.021.
Textbook Exercises

Problem (9)
Twenty percent of all telephones of a certain type are submitted
for service while under warranty. Of these, 60% can be repaired,
whereas the other 40% must be replaced with new units. If a
company purchases ten of these telephones, what is the probability
that exactly two will end up being replaced under warranty?
Ans:

p = P(Replaced) = P(Replaced | Submitted) P(Submitted) = (.4)(.2) = .08.

b(2, 10, 0.08) = C(10, 2)(0.08)^2(0.92)^8 = 0.1478.
Textbook Exercises

Problem (10)
A toll bridge charges 1.00 for passenger cars and 2.50 for other
vehicles. Suppose that during daytime hours, 60% of all vehicles
are passenger cars. If 25 vehicles cross the bridge during a
particular daytime period, what is the resulting expected toll
revenue?
Ans. T = (1.00)X + (2.50)(25 − X) = 62.5 − 1.5X, where X ∼ Bin(25, 0.6).
So, E[T] = 62.5 − 1.5 E[X] = 62.5 − 1.5(15) = 40.
Problem (11)
The number of eggs X , laid by the female tawny owl has a probability
distribution as follows
X 2 3 4
P[X= x] 0.1 0.2 0.7

For any egg the probability that it is hatched is 0.8, independently of all
other eggs. Let Y denote the number of hatched eggs in a randomly
chosen nest. Then obtain PDF for Y .
Idea: Observed values for Y = 0, 1, 2, 3, 4.
     
P[Y = 0] = (.1) C(2, 0)(.8)^0(.2)^2 + (.2) C(3, 0)(.8)^0(.2)^3 + (.7) C(4, 0)(.8)^0(.2)^4

and so on....
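The "and so on" can be completed by code: mix the binomial hatching counts over the distribution of the number of eggs laid. A sketch of the full density of Y:

```python
# f_Y(y) = sum over n of P[X = n] * C(n, y) (0.8)^y (0.2)^(n - y),
# a mixture of Bin(n, 0.8) densities weighted by the egg-count distribution.
from math import comb

fX = {2: 0.1, 3: 0.2, 4: 0.7}
p_hatch = 0.8

def fY(y):
    return sum(pn * comb(n, y) * p_hatch ** y * (1 - p_hatch) ** (n - y)
               for n, pn in fX.items() if y <= n)

pdf = {y: round(fY(y), 5) for y in range(5)}
print(pdf)
print(abs(sum(fY(y) for y in range(5)) - 1) < 1e-12)  # True: it is a density
```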
Problem (12)
For what value of p is V(X) = np(1 − p) maximized?
Ans.: 1/2.
Textbook Exercise

Problem (13)
A student who is trying to write a paper for a course has a choice
of two topics, A and B. If topic A is chosen, the student will order
two books through interlibrary loan, whereas if topic B is chosen,
the student will order four books. The student believes that a
good paper necessitates receiving and using at least half the books
ordered for either topic chosen. If the probability that a book
ordered through interlibrary loan actually arrives in time is .9 and
books arrive independently of one another, which topic should the
student choose to maximize the probability of writing a good
paper? What if the arrival probability is only .5 instead of .9?
Ans: We are interested in calculating P[at least half books
received] in each case.
(a) Topic B, (b) Topic A
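A quick check of both cases (a sketch; `p_good_paper` is a hypothetical helper):

```python
from math import comb

def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

def p_good_paper(n, p):
    # P[at least half of the n ordered books arrive], arrivals ~ Bin(n, p)
    need = n // 2  # half of 2 is 1, half of 4 is 2
    return sum(binom_pmf(k, n, p) for k in range(need, n + 1))

for p in (0.9, 0.5):
    a, b = p_good_paper(2, p), p_good_paper(4, p)
    print(p, round(a, 4), round(b, 4))
# With p = 0.9: A gives 0.99, B gives 0.9963, so choose B.
# With p = 0.5: A gives 0.75, B gives 0.6875, so choose A.
```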
Hypergeometric random variable
• The experiment consists of drawing a random sample of size n
without replacement and without regard to order from a
collection of N objects.
• Of the N objects, r have a trait of interest to us; the other
N − r do not have the trait.
• The random variable X is the number of objects in the sample
with the trait.
Definition
A random variable X has a hypergeometric distribution with
parameters N, n and r if its density is given by

f(x) := \frac{\binom{r}{x}\binom{N-r}{n-x}}{\binom{N}{n}}

where max[0, n − (N − r)] ≤ x ≤ min(n, r) and N, r and n are
positive integers.

Remember:

\binom{N}{n} = \sum_{x=0}^{N} \binom{r}{x}\binom{N-r}{n-x}.
Theorem
Let X be a hypergeometric variable with parameters N, n and r.
Then,
• E[X] = n\frac{r}{N},
• Var X = n\frac{r}{N}\left(\frac{N-r}{N}\right)\left(\frac{N-n}{N-1}\right).

Hint: If X follows a hypergeometric distribution with parameters n, r
and N, then show that E(X^k) = \frac{nr}{N} E[(Y + 1)^{k-1}], where Y is
hypergeometric with parameters n − 1, r − 1, N − 1. OR
calculate E[X(X − 1)].

Fact (Approximation to binomial distribution)
The hypergeometric distribution tends to the binomial distribution as
N → ∞ and \frac{r}{N} → p.
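A numeric sanity check of the mean/variance formulas and of the binomial limit (a sketch with illustrative parameter values; `hyper_pmf` is my own helper):

```python
from math import comb

def hyper_pmf(x, N, n, r):
    return comb(r, x) * comb(N - r, n - x) / comb(N, n)

# Mean and variance should match n*r/N and n*(r/N)*((N-r)/N)*((N-n)/(N-1)).
N, n, r = 50, 5, 20
support = range(max(0, n - (N - r)), min(n, r) + 1)
mean = sum(x * hyper_pmf(x, N, n, r) for x in support)
var = sum(x**2 * hyper_pmf(x, N, n, r) for x in support) - mean**2

# Binomial approximation: large N with r/N held at p
p = r / N
N_big = 100_000
approx = hyper_pmf(2, N_big, n, int(p * N_big))
exact_binom = comb(n, 2) * p**2 * (1 - p)**(n - 2)
print(mean, var, approx, exact_binom)
```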
Problem
Small electric motors are shipped in lots of 50. Before such a
shipment is accepted, an inspector chooses 5 of these motors and
inspects them. If none of these tested motors are defective, the lot
is accepted. If one or more are found to be defective, the entire
shipment is inspected. Suppose that there are, in fact, three
defective motors in the lot. What is the probability that 100
percent inspection is required?
Ans. If we let X be the number of defective motors found, 100
percent inspection will be required if and only if X ≥ 1. Hence

P(X ≥ 1) = 1 − P(X = 0) = 1 − \frac{\binom{3}{0}\binom{47}{5}}{\binom{50}{5}}.
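Evaluating this with `math.comb` (a minimal sketch):

```python
from math import comb

# X = number of defectives in a sample of 5 from a lot of 50 containing 3 defectives
p_full_inspection = 1 - comb(3, 0) * comb(47, 5) / comb(50, 5)
print(round(p_full_inspection, 4))  # ≈ 0.276
```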
Problem
An urn contains 4 white and 4 black balls. We randomly choose 4
balls. If 2 of them are white and 2 are black, we stop. If not, we
replace the balls in the urn and again randomly select 4 balls. This
continues until exactly 2 of the 4 chosen are white. What is the
probability that we shall make exactly three selections?
Ans.: Let X denote the number of trials/selections. Then
X ∼ Geom(p) where p = \frac{\binom{4}{2}\binom{4}{2}}{\binom{8}{4}}.
We want to calculate P[X = 3].
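The computation, sketched in Python:

```python
from math import comb

p = comb(4, 2) * comb(4, 2) / comb(8, 4)   # 36/70: chance of 2 white and 2 black
p_three_selections = (1 - p)**2 * p        # geometric: first success on trial 3
print(round(p_three_selections, 4))
```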
Problem
A slitter assembly contains 48 blades. Each day, five blades are
selected at random and evaluated for sharpness. If any dull blade is
found, the assembly is replaced with newly sharpened set of blades
else all five blades are put back in the assembly. If 10 blades in an
assembly are dull, what is the probability that the assembly is not
replaced until the third day of evaluation?
Idea: Let Y denote the number of days needed for replacement of the
assembly (for the first time). Then Y ∼ Geom(p) where

p = P[At least one dull blade] = 1 − P[No dull blade] = 1 − \frac{\binom{10}{0}\binom{38}{5}}{\binom{48}{5}}.

We are interested in calculating P[Y = 3].
Problem
A purchaser of electrical components buys them in lots of size 10.
It is his policy to inspect 3 components randomly from a lot and to
accept the lot only if all 3 are nondefective. If 30 percent of the
lots have 4 defective components and 70 percent have only 1,
what proportion of lots does the purchaser reject?
Idea:

P[Accept] = (0.3)\frac{\binom{4}{0}\binom{6}{3}}{\binom{10}{3}} + (0.7)\frac{\binom{1}{0}\binom{9}{3}}{\binom{10}{3}}.

So, (1 − P[Accept]) × 100% gives the proportion of lots that the purchaser rejects.
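A sketch of the computation (`p_all_good` is my own helper name):

```python
from math import comb

def p_all_good(defectives, lot=10, sample=3):
    # P[all 3 inspected components are nondefective]: hypergeometric with x = 0
    return comb(defectives, 0) * comb(lot - defectives, sample) / comb(lot, sample)

p_accept = 0.3 * p_all_good(4) + 0.7 * p_all_good(1)
p_reject = 1 - p_accept
print(round(100 * p_reject), "percent of lots rejected")  # 46 percent
```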
Negative Binomial Distribution
• The experiment consists of a sequence of independent trials.
• Each trial can result in either a success(S) or a failure(F ).
• The probability of success is constant from trial to trial.
• The experiment continues (trials are performed) until a total
of r successes have been observed, where r is a specified
positive integer.
• Let f (x; r , p) denote the probability that there are x failures
preceding the r th success in x + r trials.
• The rv X = the number of failures that precede the r th
success
• Now, the last trial must be a success, whose probability is p.
• In the remaining (x + r − 1) trials we must have (r − 1)
successes, whose probability is \binom{x+r-1}{r-1} p^{r-1} q^x.
• Hence, f(x; r, p) = \binom{x+r-1}{r-1} p^{r-1} q^x \cdot p = \binom{x+r-1}{r-1} p^r q^x.
• Remember: (1 − x)^{-r} = \sum_{i=0}^{\infty} \binom{i+r-1}{i} x^i.
Negative Binomial Distribution Continued
Definition
A rv X is said to follow a negative binomial distribution with
parameters r and p if its pmf is given by

f(x; r, p) = \binom{x+r-1}{r-1} p^r q^x, x = 0, 1, 2, · · ·

Alternate Definition
• The rv X denotes the trial number on which the r th
success occurs. Here, r is a positive integer greater than or
equal to one.
• This is equivalent to saying that the rv X denotes the number
of trials needed to observe the r th success.

Definition
A rv X is said to have a negative binomial distribution with
parameters p and r if its density f is given by

f(x; r, p) = \binom{x-1}{r-1} p^r q^{x-r}, r = 1, 2, · · · , x = r, r + 1, r + 2, . . .

For us, X ∼ NBin(r, p) means X is a random variable which counts
the number of trials needed for obtaining the r-th success.
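The two definitions differ only by the shift x → x + r (failures vs. trial number); a quick numeric check, with illustrative parameter values:

```python
from math import comb

def nb_failures(x, r, p):
    # pmf of the number of failures before the r-th success (x = 0, 1, 2, ...)
    return comb(x + r - 1, r - 1) * p**r * (1 - p)**x

def nb_trials(x, r, p):
    # pmf of the trial number of the r-th success (x = r, r+1, ...)
    return comb(x - 1, r - 1) * p**r * (1 - p)**(x - r)

r, p = 3, 0.4
shift_ok = all(abs(nb_failures(x, r, p) - nb_trials(x + r, r, p)) < 1e-15
               for x in range(100))
total = sum(nb_failures(x, r, p) for x in range(200))  # should be ~1
print(shift_ok, round(total, 9))
```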
Mean, variance and mgf

Theorem
Let X be a negative binomial rv (where X denotes the number of
trials needed to obtain the r-th success) with parameters r and p.
Then
• m_X(t) = \frac{(pe^t)^r}{(1-qe^t)^r}, t < − ln(1 − p).
• E[X] = \frac{r}{p}
• V(X) = \frac{rq}{p^2}
Negative Binomial Distribution
• Just as a Binomial rv can be represented as a sum of
Bernoulli rvs, a Negative Binomial rv can be represented as a
sum of Geometric rvs.
Justification1: Let X1 be the number of trials until the first
success, X2 be the number of trials between the first success
and the second success. In general, let Xi be the number of
trials between the (i − 1)th success and the i th success.
Note that X1 ∼ Geom(p), X2 ∼ Geom(p), and similarly for all
the Xi . Furthermore, the Xi are independent, because the
trials are all independent of each other. Adding the Xi , we get
the total number of trials until (including) the r th success,
which is X .
Justification2: Let X_i ∼ Geom(p) for i = 1, . . . , r such that the
X_i 's are independent; then \sum_{i=1}^{r} X_i ∼ NBin(r, p).
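Justification 1 can be checked by simulation (a sketch; the sample size and seed are arbitrary choices):

```python
import random
random.seed(0)

def geom(p):
    # number of trials until (and including) the first success
    n = 1
    while random.random() >= p:
        n += 1
    return n

def neg_binom(r, p):
    # sum of r independent Geometric(p) variables = trial number of the r-th success
    return sum(geom(p) for _ in range(r))

r, p, N = 4, 0.8, 200_000
mean = sum(neg_binom(r, p) for _ in range(N)) / N
print(round(mean, 3), "vs theoretical", r / p)  # theoretical mean r/p = 5.0
```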
Example

Problem
The probability that an experiment will succeed is 0.8. If the
experiment is repeated until four successful outcomes have
occurred, what is the expected number of failures required?

Problem
What is the average number of times one must throw a die until
the outcome ’1’ has occurred 4 times?
Ans.: We are interested in the expected number of trials until the
outcome '1' has occurred 4 times: E[X] = r/p = 4/(1/6) = 24.
Problem
Suppose a small factory has 10 machines in it. Probability of any
machine producing defective item in a trial is 0.1. An inspector
rejects a machine if it produces 5th defective item in at most 7
trials. Find the probability that exactly two machines would be
rejected.

Problem
Suppose an experiment consists of tossing a fair coin until four
heads occur. What is the probability that the experiment ends
after exactly seven flips of the coin with a head on the sixth toss as
well as on the seventh?
Problem
Consider an experiment consists of independent Bernoulli trials
where probability of success remains constant at each trial. Let X
denote number of failures required to obtain r -th success.
Calculate mean, variance and moment generating function for X .

E[X] = \frac{rq}{p}, \quad V[X] = \frac{rq}{p^2}, \quad m_X(t) = \frac{p^r}{(1 - qe^t)^r}.
Definition
A random variable X is said to have a Poisson distribution with
parameter k if its density f is given by

f(x) := \frac{e^{-k} k^x}{x!}, \quad k > 0, \; x = 0, 1, 2, . . .
Steps in solving a Poisson problem
1. Determine the basic unit of measurement being used.
2. Determine λ := average number of occurrences of the event
per unit.
3. Determine s := length or size of the observation period.
4. The random variable X, the number of occurrences of the
event in the interval of size s, follows a Poisson distribution
with parameter k = λs.
Problem
A loom experiences one yarn breakage approximately every 10
hours. A particular style of cloth is being produced that will take
25 hours on this loom. If 3 or more breaks are required to render
the product unsatisfactory, find the probability that the style of
cloth is finished with acceptable quality.
Answer: In this case

λ = \frac{1}{10}, \quad s = 25, \quad k = λs = 2.5.

Let X = number of breaks. Then we want to calculate

P[X < 3] = P[X = 0] + P[X = 1] + P[X = 2]
= e^{-2.5}\left(\frac{(2.5)^0}{0!} + \frac{(2.5)^1}{1!} + \frac{(2.5)^2}{2!}\right).
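The numeric value, as a sketch:

```python
from math import exp, factorial

lam, s = 1 / 10, 25      # one break per ~10 hours; the job takes 25 hours
k = lam * s              # Poisson parameter k = 2.5

# Acceptable quality <=> fewer than 3 breaks
p_acceptable = sum(exp(-k) * k**x / factorial(x) for x in range(3))
print(round(p_acceptable, 4))  # ≈ 0.5438
```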
Theorem
Let X be a Poisson random variable with parameter k. Then,
• The moment generating function for X is given by

m_X(t) = e^{k(e^t - 1)}.

• E[X] = k and Var X = k.

Problem
Let X1 and X2 be two independent Poisson variables with
parameters k1 and k2 respectively. Determine distribution of
X1 + X2 .
Problem
Suppose that the number of telephone calls coming into a
telephone exchange between 10 A.M. to 11 A.M., say, X1 , is a
random variable with Poisson distribution with parameter 2.
Similarly, the number of calls arriving between 11 A.M. and 12
noon, say, X2 , has a Poisson distribution with parameter 6. If X1
and X2 are independent, what is the probability that more than 5
calls come in between 10 A.M. to 12 noon ?
Idea: Y = X1 + X2 ∼ Poisson(2 + 6).
We want to calculate P[Y > 5].
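Evaluating P[Y > 5] for Y ∼ Poisson(8) (a sketch):

```python
from math import exp, factorial

k = 2 + 6   # sum of independent Poissons is Poisson(k1 + k2)
p_more_than_5 = 1 - sum(exp(-k) * k**x / factorial(x) for x in range(6))
print(round(p_more_than_5, 4))  # ≈ 0.8088
```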
Problem
A company rents out time on a computer for periods of t hours,
for which it receives Rs. 600/- an hour. The number of times the
computer breaks down during t hours is a r.v. having the Poisson
distribution with k = 0.8t and if the computer breaks down x
times during t hours it costs Rs. 50x 2 to fix it. How should the
company select t in order to maximize its expected profit.
Idea:
Profit = 600t − repair cost. Since X ∼ Poisson(0.8t), we have
E[X^2] = Var X + (E[X])^2 = 0.8t + (0.8t)^2. So, if we denote the
expected profit by φ(t), we have

φ(t) = 600t − 50E[X^2] = 560t − 32t^2.

We see that φ(t) attains its maximum at t = 560/64 = 8.75.
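A grid search confirms the calculus (a sketch; the grid resolution is an arbitrary choice):

```python
# phi(t) = 600t - 50*E[X^2] with X ~ Poisson(0.8t), so E[X^2] = 0.8t + (0.8t)^2
def expected_profit(t):
    k = 0.8 * t
    return 600 * t - 50 * (k + k * k)

# Scan t = 0.00, 0.01, ..., 20.00; the quadratic 560t - 32t^2 peaks at t = 8.75.
best_t = max((t / 100 for t in range(0, 2001)), key=expected_profit)
print(best_t, expected_profit(best_t))  # 8.75, profit 2450
```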


Problem
Assume that cars pass under a bridge at a rate of 100 per hour
according to a Poisson process.
(i) Find the time interval such that the probability that no car will
pass during this interval is at least 0.25.
(ii) Find the probability that during a 3 minute period no cars will
pass under the bridge ?
Idea: (i) 100 cars in 60 mins, so k = \frac{5t}{3} for t minutes.
We want t such that P[X = 0] ≥ 0.25.
Direct calculation gives t ≤ \frac{(1.3863) \times 3}{5}.
(ii) t = 3 ∴ k = 5.

P[X = 0] = \frac{e^{-5} k^0}{0!} = e^{-5}.
Problem
Suppose that probability that an item produced by a certain
machine will be defective is 0.1. Find the probability that a
sample of 40 items will contain at most one defective item.

Fact (Binomial ⇒ Poisson)
When n → ∞ and p → 0, while np = k remains constant, the
limiting form of the binomial distribution is the Poisson
distribution.
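For the problem above, comparing the exact binomial answer with its Poisson approximation (k = np = 4):

```python
from math import comb, exp, factorial

n, p = 40, 0.1
k = n * p  # Poisson parameter 4

# P[at most one defective item in the sample of 40]
exact = sum(comb(n, x) * p**x * (1 - p)**(n - x) for x in range(2))
approx = sum(exp(-k) * k**x / factorial(x) for x in range(2))
print(round(exact, 4), round(approx, 4))
```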
Problem
Let X be a Poisson variable with E[X] = ln(2). The value of
E[cos(πX)] is
(i) 0.25 (ii) 2/3 (iii) 0.5 (iv) NOTA
Answer: (i) because

E[cos(πX)] = \sum_{x=0}^{\infty} (-1)^x \frac{e^{-k} k^x}{x!} = e^{-2k} = e^{-2\ln(2)} = 2^{-2} = 0.25.

Problem
Let f(x) be the PDF of a random variable X satisfying f(x) = 0 for
all x ≤ 3, then
(i) E[X] > E[X^2] (ii) E[X] = E[X^2]
(iii) E[X^2] > E[X] (iv) NOTA
Answer: (iii) as X^2 > X when X > 3.
Problem
If X is a Poisson random variable such that
P[X = 2] = 9P[X = 4] + 90P[X = 6], then find the variance of X .
Idea: Let k be the parameter of the distribution. The given relation reduces to

k^4 + 3k^2 − 4 = 0.

Hence, k = 1 (the only positive root). So,

Var(X) = k = 1.
Problem
The number of times that an individual contracts a cold in a given year is
a Poisson process with parameter λ = 3 per year. Suppose a new drug
just has been marketed that reduces the Poisson parameter to λ = 2 per
year for 75 percent of the population. For other 25 percent of the
population, the drug has no appreciable effect on colds. If an individual
tries the drug for a year and has 0 colds in that time, how likely is it that
the drug is beneficial for him/her ?
Ans:

A1 : event that the drug was beneficial for the individual.
A2 : event that the drug was not beneficial for the individual.
B : event that the individual has zero colds in the year.

We are given that P[A1] = .75 and P[A2] = .25. We want P[A1 | B].

P[A1 | B] = \frac{P[B | A1]P[A1]}{P[B | A1]P[A1] + P[B | A2]P[A2]} = \frac{e^{-2}(.75)}{e^{-2}(.75) + e^{-3}(.25)}.
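Evaluating the posterior numerically (a sketch):

```python
from math import exp

p_beneficial, p_not = 0.75, 0.25
# P[0 colds | drug works] = e^{-2}; P[0 colds | no effect] = e^{-3}
num = exp(-2) * p_beneficial
posterior = num / (num + exp(-3) * p_not)
print(round(posterior, 4))  # ≈ 0.8908
```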
Problem
A manager accepts the work submitted by his typist only when there is
no mistake in the work. The typist has to type on an average 20 letters
per day about 200 words each. Find the chance of her making a mistake
(i) if less than 1% of the letters submitted by her are rejected.
(ii) if on 90% days all the letters submitted by her are accepted.
Idea: (i) Let X denote the number of mistakes per letter. Then
X ∼ Poisson(k) with P[X = 0] ≥ 0.99.
Hence, k ≤ −ln(0.99) ≈ 0.01005. We see that k = 200λ, where λ is the chance of her
making a mistake (in a word), so λ ≤ 5.03 × 10^{-5}.
(ii) Using a similar argument as in the previous part, the day's work
(20 letters of 200 words each) is accepted when there is no mistake in
any of the 20 × 200 = 4000 words. If λ denotes the chance of making a mistake
in a word, then we are given

e^{-4000λ} = 0.90 =⇒ λ = 2.634 × 10^{-5}.


Problem
The probability of a man hitting a target is 1/4. How many times
must he fire so that the probability of his hitting the target at least
once is greater than 3/4?
