Lecture 1: To Be Determined
Lecturer: Jim Pitman
Stochastic Process: Random process, evolution over time/space. Especially models for sequences of random variables:
\[ X_0, X_1, X_2, \ldots, \qquad \text{Time} = \{0, 1, 2, \ldots\} \]
Random vector $(X_0, X_1, \ldots, X_N)$
$(X_t,\; t \in \{0, 1, \ldots, N\})$
$(X_t,\; 0 \le t \le 1)$, Time $= [0, 1]$
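As a concrete sketch (an added illustration; the random walk here is just one arbitrary choice of such a sequence), a few lines of Python can generate a path $X_0, X_1, \ldots, X_N$ in discrete time:

    import random

    # Added sketch: one path of a sequence X_0, X_1, ..., X_N indexed by
    # Time = {0, 1, ..., N}; a +/-1 random walk is used only as an example.
    random.seed(1)
    N = 10
    X = [0]                                         # X_0 = 0
    for t in range(1, N + 1):
        X.append(X[-1] + random.choice([-1, 1]))    # X_t = X_{t-1} +/- 1
    print(X)                                        # prints the simulated path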
In background always have
Probability space: $(\Omega, \mathcal{F}, P)$ TRIPLE
Set $\Omega$ = set of all possible outcomes
Collection $\mathcal{F}$ of events = subsets of $\Omega$
Assume $\mathcal{F}$ is closed under countable unions: $A_1 \cup A_2 \cup A_3 \cup \cdots$ is an event;
Axiom: If the $A_i$'s are disjoint, $A_i \cap A_j = \emptyset$ for $i \ne j$, then
\[ P\Big(\bigcup_{i=1}^{\infty} A_i\Big) = \sum_{i=1}^{\infty} P(A_i). \]
Measurability. $P\{\omega : X(\omega) \le x\} = F(x)$. $F$ is called the cumulative distribution function of $X$. $P(X \le x) = F(x)$.
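For example (an added illustration), if $X$ is exponential with rate $\lambda > 0$, then
\[ F(x) = P(X \le x) = \begin{cases} 1 - e^{-\lambda x}, & x \ge 0, \\ 0, & x < 0, \end{cases} \]
which is nondecreasing with $F(x) \to 1$ as $x \to \infty$.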
Discrete-valued case: list of values $x_0, x_1, x_2, \ldots$, often $\{0, 1, 2, 3, \ldots\}$.
$P(X = x_i) = p_i$. If $\sum_{i=0}^{\infty} p_i = 1$, we say $X$ has a discrete distribution with probabilities $(p_i)$ at values $(x_i)$.
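For example (an added illustration), the geometric distribution on $\{0, 1, 2, \ldots\}$ takes $p_i = (1-p)^i p$ for a fixed $0 < p < 1$; then
\[ \sum_{i=0}^{\infty} p_i = p \sum_{i=0}^{\infty} (1-p)^i = p \cdot \frac{1}{1-(1-p)} = 1, \]
so $(p_i)$ is a discrete distribution in the above sense.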
$E(X) = \sum_i x_i p_i$. Technical issue:
\[ \sum_i x_i p_i := \lim_{N \to \infty} \sum_{i=0}^{N} x_i p_i; \]
we need to assume the limit exists (and does not depend on the order of the values), i.e.,
\[ \sum_{i=0}^{\infty} |x_i| p_i < \infty. \]
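To see why absolute convergence is needed (an added example), take values $x_i = (-1)^i i$ with probabilities $p_i = \frac{6}{\pi^2 i^2}$ for $i = 1, 2, \ldots$ (these sum to $1$ because $\sum_{i \ge 1} i^{-2} = \pi^2/6$). The ordered partial sums
\[ \sum_{i=1}^{N} x_i p_i = \frac{6}{\pi^2} \sum_{i=1}^{N} \frac{(-1)^i}{i} \]
converge as $N \to \infty$, but $\sum_{i=1}^{\infty} |x_i| p_i = \frac{6}{\pi^2} \sum_{i=1}^{\infty} \frac{1}{i} = \infty$, so $E(X)$ is not defined: rearranging the values $x_i$ would change the limit.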
Other cases:
\[ F(x) = P(X \le x) \]
Density case: $F(x) = \int_{-\infty}^{x} f(y)\, dy$
\[ E(X) = \int_{-\infty}^{\infty} x f(x)\, dx \]
\[ E(g(X)) = \int_{-\infty}^{\infty} g(x) f(x)\, dx \quad \text{provided} \quad \int_{-\infty}^{\infty} |g(x)| f(x)\, dx = E|g(X)| < \infty \]
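For instance (an added example), if $X$ is uniform on $[0,1]$, so $f(x) = 1$ for $0 \le x \le 1$, and $g(x) = x^2$, then
\[ E(g(X)) = E(X^2) = \int_0^1 x^2\, dx = \tfrac{1}{3}, \]
and the proviso holds since $E|g(X)| = \tfrac{1}{3} < \infty$.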
We start with naive ideas of $E(X)$ defined by $\sum$ and $\int$.
We often encounter situations where
\[ 0 \le X_n \uparrow X, \qquad \text{i.e. } 0 \le X_1 \le X_2 \le \cdots \le X \text{ with } X_n \to X, \]
and want to conclude that
\[ 0 \le E(X_n) \uparrow E(X). \]
Claim: $X \le Y \implies E(X) \le E(Y)$.
Proof: $Y = X + (Y - X)$ with $Y - X \ge 0$, so $E(Y) = E(X) + E(Y - X) \ge E(X)$.
Monotone Convergence Theorem: If $0 \le X_n \uparrow X$, then $0 \le E(X_n) \uparrow E(X)$.
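A short numerical sketch (added; the choice $X \sim \text{Exponential}(1)$ and $X_n = \min(X, n)$ is just one convenient instance of $0 \le X_n \uparrow X$):

    import random

    # Added sketch of monotone convergence: X ~ Exponential(1), X_n = min(X, n),
    # so 0 <= X_n increases to X and E(X_n) should climb toward E(X) = 1.
    random.seed(0)
    samples = [random.expovariate(1.0) for _ in range(200_000)]
    for n in (1, 2, 3, 4, 5):
        est = sum(min(x, n) for x in samples) / len(samples)
        print(f"E[min(X, {n})] ~ {est:.3f}")   # increases toward 1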
Conditional Probability
\[ P(A \mid B) = \frac{P(AB)}{P(B)} \quad \text{provided } P(B) \ne 0 \]
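For example (added): roll two fair dice and let $B = \{\text{first die shows } 1\}$, $A = \{\text{sum equals } 4\}$. Then $P(AB) = P\{(1,3)\} = \tfrac{1}{36}$ and $P(B) = \tfrac{1}{6}$, so $P(A \mid B) = \frac{1/36}{1/6} = \tfrac{1}{6}$.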
Easily for a discrete random variable $X$ we can define
\[ E(Y \mid X = x) = \begin{cases} \displaystyle\sum_i y_i \, P(Y = y_i \mid X = x) & \text{(discrete case)} \\[4pt] \displaystyle\int y \, f(y \mid X = x)\, dy & \text{(density case)} \end{cases} \]
where $f(\cdot \mid X = x)$ is the conditional density of $Y$ given $X = x$.
Suppose you know the values of $E(Y \mid X = x)$ for all values $x$ of $X$; how do you find $E(Y)$?
\[ E(Y) = \sum_x \underbrace{E(Y \mid X = x)}_{\varphi(x)} P(X = x) = E(\varphi(X)) \]
For a discrete $X$, $E(Y \mid X)$ is a random variable, whereas $E(Y \mid X = x)$ is a value. So
\[ E(Y) = E(E(Y \mid X)) \]
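A quick added check of this identity: roll two fair dice, let $X$ be the first roll and $Y$ the sum of the two rolls. Then $E(Y \mid X = x) = x + 3.5$, so $E(Y \mid X) = X + 3.5$ and
\[ E(Y) = E(E(Y \mid X)) = E(X) + 3.5 = 3.5 + 3.5 = 7, \]
as expected.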
Exercises with random sums
$X_1, X_2, \ldots$ is a sequence of independent and identically distributed random variables, $S_n := X_1 + \cdots + X_n$, and $N$ is a random variable. For simplicity, assume $N$ is independent of $X_1, X_2, \ldots$. We want a formula for $E(S_N)$:
\[ E(S_N) = E(X_1 + \cdots + X_N) = E(E(S_N \mid N)). \]
\[ E(S_N \mid N = n) = E(X_1 + \cdots + X_n) = n\, E(X_1) \]
\[ \implies E(S_N) = E(E(S_N \mid N)) = E(N\, E(X_1)) = E(N)\, E(X_1) \]
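A small Monte Carlo sketch (added; the distributions are arbitrary choices for illustration): with $N \sim \text{Poisson}(3)$ independent of $X_i \sim \text{Exponential}$ with mean $2$, the formula predicts $E(S_N) = E(N) E(X_1) = 3 \cdot 2 = 6$.

    import random

    # Added check of E(S_N) = E(N) E(X_1) for N independent of the i.i.d. X_i.
    # Arbitrary choices: N ~ Poisson(3), X_i ~ Exponential with mean 2.
    random.seed(0)

    def poisson(lam):
        # Count Exponential(lam) arrivals in [0, 1]; the count is Poisson(lam).
        t, k = 0.0, 0
        while True:
            t += random.expovariate(lam)
            if t > 1.0:
                return k
            k += 1

    trials, total = 100_000, 0.0
    for _ in range(trials):
        n = poisson(3.0)
        total += sum(random.expovariate(0.5) for _ in range(n))   # S_N
    print(total / trials)   # close to E(N) * E(X_1) = 6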
Stopping time: Rule for deciding when to quit.