
AMA2104 Probability and Engineering Statistics

3 Joint Distribution

Dr Bob He

Joint Distributions, discrete case

Definition
The function f(x, y) is a joint probability mass function of the discrete
random variables X and Y if
1. $f(x, y) \ge 0$ for all $(x, y)$.
2. $\sum_{x} \sum_{y} f(x, y) = 1$.
3. $P(X = x, Y = y) = f(x, y)$, where $P(A, B)$ means $P(A \text{ and } B)$.


For any region $A$ in the $xy$-plane,
$$P((X, Y) \in A) = \sum_{(x, y) \in A} f(x, y).$$

The definition can be extended to three or more random variables, e.g.,


$f(x_1, x_2, \ldots, x_n)$.
Example
We toss a die twice and let X and Y denote the numbers obtained on the first and the second toss, respectively. Find
(a) $\mathbb{E}[\max(X, Y)]$ and $\mathrm{Var}(\max(X, Y))$;
(b) $\mathbb{E}[X + Y]$ and $\mathrm{Var}(X + Y)$.
We use the table below to summarize the probability of max(X, Y) taking different values; each entry is the value of max(x, y) for the outcome (x, y).

x\y   1   2   3   4   5   6
 1    1   2   3   4   5   6
 2    2   2   3   4   5   6
 3    3   3   3   4   5   6
 4    4   4   4   4   5   6
 5    5   5   5   5   5   6
 6    6   6   6   6   6   6
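All 36 outcomes (x, y) are equally likely, so the required expectations and variances can be checked by brute-force enumeration. The snippet below is a minimal Python sketch added for illustration (the helper name `mean_var` is made up, not part of the course material):

```python
# Brute-force check of the two-dice example by enumerating all 36 outcomes.
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))   # all (x, y) pairs
p = 1 / len(outcomes)                             # each outcome has probability 1/36

def mean_var(values):
    """Return (E[V], Var(V)) for a statistic V listed over all outcomes."""
    m = sum(v * p for v in values)
    return m, sum((v - m) ** 2 * p for v in values)

print(mean_var([max(x, y) for x, y in outcomes]))   # E ~ 4.47, Var ~ 1.97
print(mean_var([x + y for x, y in outcomes]))       # E = 7,    Var ~ 5.83
```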

Joint Distributions, continuous case

Definition
The function f(x, y) is a joint probability density function of the
continuous random variables X and Y if
1. $f(x, y) \ge 0$ for all $(x, y)$.
2. $\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f(x, y)\, dx\, dy = 1$.
3. $P((X, Y) \in A) = \iint_A f(x, y)\, dx\, dy$, for any region $A$ in the $xy$-plane.


The definition can be extended to three or more random variables, e.g.,
$f(x_1, x_2, \ldots, x_n)$.
Example
Let X and Y be two continuous random variables having the joint probability
density function

$$f(x, y) = \begin{cases} \frac{2}{5}(2x + 3y), & 0 \le x \le 1,\ 0 \le y \le 1, \\ 0, & \text{otherwise.} \end{cases}$$

Verify that $f$ is a joint PDF. Find $P((X, Y) \in A)$, where
$$A = \{ (x, y) : 0 < x < \tfrac{1}{2},\ \tfrac{1}{4} < y < \tfrac{1}{2} \}.$$
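As a quick check (a worked computation added to these notes),
$$\int_0^1 \int_0^1 \tfrac{2}{5}(2x + 3y)\, dx\, dy = \int_0^1 \tfrac{2}{5}(1 + 3y)\, dy = \tfrac{2}{5}\left(1 + \tfrac{3}{2}\right) = 1,$$
$$P((X, Y) \in A) = \int_{1/4}^{1/2} \int_0^{1/2} \tfrac{2}{5}(2x + 3y)\, dx\, dy = \tfrac{2}{5} \int_{1/4}^{1/2} \left(\tfrac{1}{4} + \tfrac{3y}{2}\right) dy = \tfrac{2}{5} \cdot \tfrac{13}{64} = \tfrac{13}{160}.$$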

Double Integrals

Assume that the region in $\mathbb{R}^2$ is a rectangle, which we will denote by $R = [a, b] \times [c, d]$. We will first divide up $a \le x \le b$ into $n$ subintervals and divide up $c \le y \le d$ into $m$ subintervals. This will divide up $R$ into a series of smaller rectangles, and from each of these we will choose a point $(x_i, y_j)$.

Over each of these smaller rectangles we will construct a box whose height is given by $f(x_i, y_j)$. Each of the rectangles has a base area of $\Delta A$ and a height of $f(x_i, y_j)$, so the volume of each of these boxes is $f(x_i, y_j)\,\Delta A$. The volume under the function is then approximately
$$V \approx \sum_{i=1}^{n} \sum_{j=1}^{m} f(x_i, y_j)\, \Delta A.$$

To get a better estimation of the volume we will take $n$ and $m$ larger and larger, and to get the exact volume we will need to take the limit
$$V = \lim_{n, m \to \infty} \sum_{i=1}^{n} \sum_{j=1}^{m} f(x_i, y_j)\, \Delta A.$$

Definition
We define the double integral of the function f (x, y ) over the
rectangle R to be the limit
$$\iint_R f(x, y)\, dA = \iint_R f(x, y)\, dx\, dy = \lim_{n, m \to \infty} \sum_{i=1}^{n} \sum_{j=1}^{m} f(x_i, y_j)\, \Delta A.$$

Iterated Integrals

$$\iint_R f(x, y)\, dA = \int_a^b \int_c^d f(x, y)\, dy\, dx = \int_c^d \int_a^b f(x, y)\, dx\, dy.$$
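To make the limit definition concrete, here is a small Python sketch added for illustration (the function name `riemann_sum` is made up). It evaluates the double Riemann sum, using midpoints for the sample points $(x_i, y_j)$, for the density $f(x, y) = \tfrac{2}{5}(2x + 3y)$ on $R = [0, 1] \times [0, 1]$; as $n$ and $m$ grow, the sum approaches the iterated-integral value $1$.

```python
# Midpoint Riemann-sum approximation of a double integral over a rectangle.
def f(x, y):
    return 0.4 * (2 * x + 3 * y)    # the joint density (2/5)(2x + 3y)

def riemann_sum(f, a, b, c, d, n, m):
    """Approximate the double integral of f over [a, b] x [c, d]."""
    dx, dy = (b - a) / n, (d - c) / m
    dA = dx * dy
    total = 0.0
    for i in range(n):
        for j in range(m):
            xi = a + (i + 0.5) * dx    # midpoint of the i-th x-subinterval
            yj = c + (j + 0.5) * dy    # midpoint of the j-th y-subinterval
            total += f(xi, yj) * dA
    return total

print(riemann_sum(f, 0, 1, 0, 1, 100, 100))   # ~ 1.0
```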

Bounded Irregular Region

Theorem
Suppose that f (x, y ) is continuous on the region R.
1. If $R$ is a vertically simple region $a \le x \le b$, $y_1(x) \le y \le y_2(x)$, then
$$\iint_R f(x, y)\, dA = \int_a^b \left( \int_{y_1(x)}^{y_2(x)} f(x, y)\, dy \right) dx.$$
2. If $R$ is a horizontally simple region $c \le y \le d$, $x_1(y) \le x \le x_2(y)$, then
$$\iint_R f(x, y)\, dA = \int_c^d \left( \int_{x_1(y)}^{x_2(y)} f(x, y)\, dx \right) dy.$$

Finding The Limits of Integration

1. Sketch the region of integration and label the bounding curves.
2. Imagine a vertical line $L$ cutting through $R$ in the direction of increasing $y$. Mark the $y$-values where $L$ enters and leaves. These are the $y$-limits of integration and are usually functions of $x$.
3. Choose $x$-limits that include all the vertical lines through $R$.


Example
The joint density for the random variables (X , Y ), where X is the temperature
change and Y is the proportion of the spectrum that shifts for a certain atomic
particle, is
$$f(x, y) = \begin{cases} 10xy^2, & 0 < x < y < 1, \\ 0, & \text{elsewhere.} \end{cases}$$
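As an illustration of the theorem above (a worked check added to these notes), the support $0 < x < y < 1$ is a horizontally simple region ($0 < y < 1$ with $0 < x < y$), so
$$\int_0^1 \int_0^y 10 x y^2\, dx\, dy = \int_0^1 10 y^2 \cdot \frac{y^2}{2}\, dy = \int_0^1 5 y^4\, dy = 1,$$
confirming that $f$ is a valid joint density.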

Marginal distributions

Definition
The marginal distributions of X alone and Y alone are:
• For the discrete case,
$$g(x) = \sum_{y} f(x, y), \quad \text{and} \quad h(y) = \sum_{x} f(x, y).$$
• For the continuous case,
$$g(x) = \int_{-\infty}^{\infty} f(x, y)\, dy, \quad \text{and} \quad h(y) = \int_{-\infty}^{\infty} f(x, y)\, dx.$$

Example
Suppose that X and Y are two discrete random variables and their joint
probability mass function f(x, y) is given by the table below.

f(x, y)             x = 0    x = 1    x = 2    Row totals h(y)
y = 0               0.13     0.05     0.05     0.23
y = 1               0.23     0.21     0.07     0.51
y = 2               0.15     0.03     0.08     0.26
Column totals g(x)  0.51     0.29     0.20     1.00

The column totals give the marginal distribution g(x) of X, and the row totals give the marginal distribution h(y) of Y.
Example
Let X and Y be two continuous random variables having the joint
probability density function

$$f(x, y) = \begin{cases} \frac{2}{5}(2x + 3y), & 0 \le x \le 1,\ 0 \le y \le 1, \\ 0, & \text{otherwise.} \end{cases}$$

Find the marginal density functions g(x) and h(y).
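For reference, a direct computation (added to these notes) gives
$$g(x) = \int_0^1 \tfrac{2}{5}(2x + 3y)\, dy = \tfrac{2}{5}\left(2x + \tfrac{3}{2}\right) = \frac{4x + 3}{5}, \qquad 0 \le x \le 1,$$
$$h(y) = \int_0^1 \tfrac{2}{5}(2x + 3y)\, dx = \tfrac{2}{5}(1 + 3y) = \frac{2 + 6y}{5}, \qquad 0 \le y \le 1,$$
with both marginal densities equal to $0$ elsewhere.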

Definition
Let X and Y be two random variables, discrete or continuous. The
conditional distribution of the random variable Y given that X = x is

$$f(y \mid x) = \frac{f(x, y)}{g(x)}, \quad \text{provided that } g(x) > 0.$$

Definition
Let $X_1, \ldots, X_n$ be $n$ random variables, discrete or continuous, with joint probability distribution $f(x_1, \ldots, x_n)$ and marginal distributions $f_1(x_1), \ldots, f_n(x_n)$ respectively. Then we say that the random variables $X_1, \ldots, X_n$ are (mutually) independent if
$$f(x_1, \ldots, x_n) = f_1(x_1)\, f_2(x_2) \cdots f_n(x_n)$$
for all $(x_1, \ldots, x_n)$ within their range.

Facts
Let X and Y be two independent random variables. Then

$$\mathbb{E}[XY] = \mathbb{E}[X]\, \mathbb{E}[Y].$$
Example
Let X and Y be two continuous random variables having the joint probability
density function
$$f(x, y) = \begin{cases} \frac{2}{5}(2x + 3y), & 0 \le x \le 1,\ 0 \le y \le 1, \\ 0, & \text{otherwise.} \end{cases}$$

Find $\mathbb{E}[X]$, $\mathbb{E}[Y]$, and $\mathbb{E}[XY]$.
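These expectations reduce to double integrals of $x f(x, y)$, $y f(x, y)$, and $xy f(x, y)$ over the unit square. The snippet below is a minimal SymPy sketch added for illustration (the helper name `expect` is made up):

```python
# Symbolic evaluation of E[X], E[Y], and E[XY] for f(x, y) = (2/5)(2x + 3y).
import sympy as sp

x, y = sp.symbols("x y")
f = sp.Rational(2, 5) * (2 * x + 3 * y)

def expect(g):
    """E[g(X, Y)] = double integral of g * f over [0, 1] x [0, 1]."""
    return sp.integrate(g * f, (x, 0, 1), (y, 0, 1))

print(expect(x))       # E[X]  = 17/30
print(expect(y))       # E[Y]  = 3/5
print(expect(x * y))   # E[XY] = 1/3, which differs from E[X]E[Y] = 17/50
```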

Moment Generating Functions (MGFs)
Definition
The moment generating function of a random variable $X$ is the function $M_X(s)$ defined by
$$M_X(s) = \mathbb{E}\!\left[e^{sX}\right].$$

Example
Suppose $X \sim \mathrm{Poisson}(\lambda)$; find the MGF of $X$.
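For reference, the standard computation (added to these notes) gives
$$M_X(s) = \mathbb{E}\!\left[e^{sX}\right] = \sum_{k=0}^{\infty} e^{sk}\, \frac{e^{-\lambda} \lambda^k}{k!} = e^{-\lambda} \sum_{k=0}^{\infty} \frac{(\lambda e^{s})^k}{k!} = e^{-\lambda} e^{\lambda e^{s}} = e^{\lambda (e^{s} - 1)}.$$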

Theorem (properties of MGF)
Let random variables $X$ and $Y$ be independent. Let $M_X(s)$ and $M_Y(s)$ be their MGFs respectively. Assume that they are finite on an open interval that contains $0$.
1. $M_X^{(m)}(0) = \mathbb{E}(X^m)$ for every positive integer $m$.
2. $M_{X+Y}(s) = M_X(s)\, M_Y(s)$.
3. If $M_X(s) = M_Y(s)$, then $X$ and $Y$ have the same distribution.
4. $M_{aX+b}(s) = e^{bs} M_X(as)$ for any constants $a$ and $b$.

Example
Suppose $X_1 \sim \mathrm{Poisson}(\lambda_1)$, $X_2 \sim \mathrm{Poisson}(\lambda_2)$, $\ldots$, and $X_m \sim \mathrm{Poisson}(\lambda_m)$ are independent. Let $Y = X_1 + \cdots + X_m$. Then $Y \sim \mathrm{Poisson}(\lambda_1 + \cdots + \lambda_m)$.
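This follows from properties 2 and 3 of the theorem above; a one-line check (added to these notes):
$$M_Y(s) = \prod_{i=1}^{m} M_{X_i}(s) = \prod_{i=1}^{m} e^{\lambda_i (e^{s} - 1)} = e^{(\lambda_1 + \cdots + \lambda_m)(e^{s} - 1)},$$
which is the MGF of a $\mathrm{Poisson}(\lambda_1 + \cdots + \lambda_m)$ distribution.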

Example
Suppose $X \sim \mathrm{Poisson}(\lambda)$. Find $\mathbb{E}[X]$ and $\mathrm{Var}(X)$.
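Using property 1 of the theorem above and $M_X(s) = e^{\lambda(e^{s} - 1)}$, a worked computation (added to these notes) gives
$$M_X'(s) = \lambda e^{s} e^{\lambda(e^{s} - 1)}, \qquad M_X''(s) = \left(\lambda e^{s} + \lambda^2 e^{2s}\right) e^{\lambda(e^{s} - 1)},$$
so $\mathbb{E}[X] = M_X'(0) = \lambda$, $\mathbb{E}[X^2] = M_X''(0) = \lambda + \lambda^2$, and $\mathrm{Var}(X) = \lambda + \lambda^2 - \lambda^2 = \lambda$.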

Example
If $X \sim \mathrm{Exp}(\lambda)$, find $M_X(s)$. Find $\mathbb{E}[X]$.
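A short computation added to these notes, assuming the rate parameterization with density $\lambda e^{-\lambda x}$ for $x \ge 0$:
$$M_X(s) = \int_0^{\infty} e^{sx}\, \lambda e^{-\lambda x}\, dx = \frac{\lambda}{\lambda - s} \quad \text{for } s < \lambda, \qquad \mathbb{E}[X] = M_X'(0) = \frac{1}{\lambda}.$$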

Example
Suppose $X \sim N(0, 1)$. Find $M_X(s)$.
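Completing the square in the exponent gives (a worked line added to these notes)
$$M_X(s) = \int_{-\infty}^{\infty} e^{sx}\, \frac{1}{\sqrt{2\pi}} e^{-x^2/2}\, dx = e^{s^2/2} \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}} e^{-(x - s)^2/2}\, dx = e^{s^2/2}.$$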

