734 Solutions
for odd k, 3 ≤ k ≤ n, and

P (∪_{i=1}^{n} Ai) ≥ Σ_{j=1}^{k} (−1)^{j−1} Σ_{I⊂{1,···,n}, |I|=j} P (∩_{ℓ∈I} Aℓ)

for even k, 2 ≤ k ≤ n.
Ans. Prove by induction, using the two-set identity P (A ∪ B) = P (A) + P (B) − P (A ∩ B).
(a) Suppose that P (A) = Q(A) for all A ∈ F with P (A) ≤ 1/2. Prove
that P = Q, i.e., that P (A) = Q(A) for all A ∈ F.
Ans. If P (A) ≤ 1/2, P (A) = Q(A). If P (A) > 1/2, P (Ac ) ≤ 1/2 and
P (Ac ) = Q(Ac ), so P (A) = 1 − P (Ac ) = 1 − Q(Ac ) = Q(A). Hence
P = Q.
(b) Give an example where P (A) = Q(A) for all A ∈ F with P (A) <
1/2, but such that P ≠ Q, i.e., that P (A) ≠ Q(A) for some A ∈ F.
Ans. Take Ω = {1, 2}, F = 2^Ω, P ({1}) = P ({2}) = 1/2 and Q({1}) = 1/3, Q({2}) = 2/3. The only A ∈ F with P (A) < 1/2 is A = ∅, on which P and Q trivially agree, yet P ({1}) ≠ Q({1}).
Ans. Since X ≥ 0 a.s., P (X² ≤ x) = P (X ≤ √x) = 1 − P (X > √x),
where √x denotes the positive square-root. Hence the distribution
function of X² is F (x) := 1 − e^{−√x} for x ≥ 0 and = 0 for x < 0.
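As a quick numerical sanity check (illustrative only, not part of the proof): the F above corresponds to X ~ Exp(1), so the empirical distribution function of X² for exponential samples should match 1 − e^{−√x}.

```python
import numpy as np

# Sanity check: if X ~ Exp(1), the empirical CDF of X^2 should
# match F(x) = 1 - exp(-sqrt(x)) on a grid of test points.
rng = np.random.default_rng(0)
X = rng.exponential(scale=1.0, size=200_000)
x_grid = np.array([0.25, 1.0, 4.0, 9.0])
empirical = np.array([(X**2 <= x).mean() for x in x_grid])
theoretical = 1.0 - np.exp(-np.sqrt(x_grid))
gap = np.max(np.abs(empirical - theoretical))
print(gap)  # small sampling error
```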
8. Let X be a random variable with P (X > 0) > 0. Prove that there is a
δ > 0 such that P (X ≥ δ) > 0.
Ans. If P (X ≥ δ) = 0 ∀ δ > 0, then P (X > 0) = P (∪_{n≥1} {X > 1/n}) ≤ Σ_{n≥1} P (X > 1/n) = 0, a contradiction.
Exercise 2
Xn(k) − X ≤ Xn − X ≤ Xn(k+1) − X ∀ k.
where the infimum is over all pairs of random variables (X, Y ) such
that the law of X is µ and the law of Y is ν. Show that µn → µ in
P(Rd ) if and only if d(µn , µ) → 0.
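One direction of the equivalence can be illustrated numerically (a sketch under stated assumptions, not the general proof): if µn is the law of X + 1/n and µ the law of X, the coupling (X + 1/n, X) bounds d(µn, µ) by 1/n, and integrals of a bounded continuous f converge accordingly.

```python
import numpy as np

# Illustration: mu_n = law of X + 1/n. The coupling (X + 1/n, X)
# gives d(mu_n, mu) <= E[(1/n) ∧ 1] = 1/n -> 0, and consistently
# E[f(X + 1/n)] -> E[f(X)] for a bounded continuous f.
rng = np.random.default_rng(1)
X = rng.normal(size=100_000)
f = np.tanh                      # bounded continuous test function
gaps = [abs(np.mean(f(X + 1.0 / n)) - np.mean(f(X))) for n in (1, 10, 100)]
print(gaps)  # decreasing with n
```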
E [‖Xn − X‖ ∧ 1] → 0

E [‖Xn − Zn‖ ∧ 1] ≤ d(µn, µ) + 1/n.

∫ f dµn → ∫ f dµ, so µn → µ.

Since η > 0 is arbitrary,
that Yn → Z in law.
d(µn , νn ) ≤ E[|Xn − Yn | ∧ 1] → 0
µn be the law of Xn for n ≥ 1. Then for f ∈ Cb(R), f (Xn) → f (X)
a.s., hence ∫ f dµn = E[f (Xn)] → E[f (X)] = ∫ f dµ by bounded convergence.

Ans. Let ϕ(t) = E[e^{itX}] = E[cos(tX) + i sin(tX)]. Then
|1 − ϕ(t)|²
= |(1 − E[cos(tX)]) − iE[sin(tX)]|²
= |E[1 − cos(tX) − i sin(tX)]|²
≤ E[|1 − cos(tX) − i sin(tX)|²]   (by Jensen's inequality)
= E[(1 − cos(tX))² + sin²(tX)]
= E[1 + cos²(tX) + sin²(tX) − 2 cos(tX)]
= 2(1 − E[cos(tX)])
= 2 − (E[cos(tX) + i sin(tX)] + E[cos(tX) − i sin(tX)])
= 2 − (ϕ(t) + ϕ̄(t))
= 2(1 − Re(ϕ(t))).
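Expanding both sides shows the inequality |1 − ϕ(t)|² ≤ 2(1 − Re ϕ(t)) is equivalent to |ϕ(t)| ≤ 1, so it also holds for any empirical characteristic function; a quick numerical check (with an arbitrary sample distribution):

```python
import numpy as np

# Check |1 - phi(t)|^2 <= 2(1 - Re phi(t)) for the empirical
# characteristic function of a sample; expanding both sides shows
# the inequality is equivalent to |phi(t)| <= 1.
rng = np.random.default_rng(2)
X = rng.exponential(size=50_000) - 0.3   # an arbitrary sample
for t in np.linspace(-5.0, 5.0, 41):
    phi = np.mean(np.exp(1j * t * X))    # empirical char. function
    assert abs(1 - phi) ** 2 <= 2 * (1 - phi.real) + 1e-12
print("inequality holds at all grid points")
```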
9. Xn, n ≥ 1, are i.i.d. real random variables. Show that Xn/n → 0 a.s. if
and only if E[|Xn|] < ∞.
Ans. Letting ⌈x⌉, ⌊x⌋ denote resp. the smallest integer ≥ x and the
largest integer ≤ x, we have E[|Xn|] < ∞ ⇐⇒ E[⌈|Xn|⌉] < ∞ ⇐⇒
E[⌊|Xn|⌋] < ∞. Hence
E[|Xn|] < ∞ ⇐⇒ Σ_{m=0}^{∞} P (|Xn| ≥ m) < ∞ ⇐⇒ Σ_{m=0}^{∞} P (|Xm| ≥ m) < ∞
⇐⇒ Σ_{m=0}^{∞} P (|Xm/m| ≥ 1) < ∞,

where the second equivalence uses the identical distribution of the Xn.
Then

E[|Xn|] < ∞ =⇒ E[k|Xn|] < ∞ ∀ k ≥ 1 =⇒ Σ_m P (|Xm/m| ≥ 1/k) < ∞ ∀ k ≥ 1,

so by the Borel–Cantelli lemma, |Xm|/m < 1/k for sufficiently large m
a.s., for each k. That is, |Xn|/n → 0 a.s.
Similarly,

E[|Xn|] = ∞ =⇒ Σ_m P (|Xm| ≥ m) = ∞ =⇒ |Xm|/m ≥ 1 i.o.

by the second Borel–Cantelli lemma (using independence), so Xn/n does not converge to 0 a.s.
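The dichotomy shows up clearly in simulation (illustrative only): for Exp(1) variables (finite mean) the ratios |Xn|/n die out, while for Cauchy variables (infinite mean) exceedances |Xn|/n ≥ 1 keep occurring.

```python
import numpy as np

# Contrast finite vs infinite mean: |X_n|/n for Exp(1) (E|X| < oo)
# versus Cauchy (E|X| = oo) samples, n = 1, ..., 100000.
rng = np.random.default_rng(3)
n = np.arange(1, 100_001)
exp_ratio = rng.exponential(size=n.size) / n
cauchy_ratio = np.abs(rng.standard_cauchy(size=n.size)) / n
print(exp_ratio[-1000:].max())     # tail maximum: tiny
print((cauchy_ratio >= 1).sum())   # exceedance count: typically several
```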
Exercise 3
where the first inequality is the conditional Jensen’s inequality and the
second inequality follows from the submartingale property of {Xn },
viz., E[Xn+1 |Fn ] ≥ Xn a.s., and the monotone increase of f .
(exp(Mn − Bn ), Fn ), n ≥ 0,
is a martingale.
Ans. Take Bn := Σ_{m=0}^{n} log E[e^{∆Mm} | F_{m−1}], n ≥ 0, with F_{−1} :=
{∅, Ω}, M_{−1} := 0. Then {Bn} is predictable and

Zn := e^{Mn − Bn} = Π_{m=0}^{n} (e^{∆Mm} / E[e^{∆Mm} | F_{m−1}]).
Therefore

E [Zn+1 | Fn] = Zn E[ e^{∆M_{n+1}} / E[e^{∆M_{n+1}} | Fn] | Fn ] = Zn.
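For a concrete instance (a sketch, not from the problem): take Mn a ±1 random walk, so that E[e^{∆Mm}|F_{m−1}] = cosh 1 and Bn = (n+1) log cosh 1; then E[Zn] = 1 for every n, which Monte Carlo confirms (short horizon chosen to keep sampling error small).

```python
import numpy as np

# Monte Carlo check: for a +/-1 random walk M_n (n = 0..6) with
# M_{-1} = 0, B_n = (n+1) * log cosh(1), so Z_n = exp(M_n - B_n)
# is a martingale and E[Z_n] = E[Z_0] = 1 for every n.
rng = np.random.default_rng(4)
steps = rng.choice([-1, 1], size=(500_000, 7))
M = steps.cumsum(axis=1)                       # M_0, ..., M_6
B = np.log(np.cosh(1.0)) * np.arange(1, 8)     # B_n = (n+1) log cosh 1
Z = np.exp(M - B)
print(Z.mean(axis=0))  # each entry close to 1
```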
Let pa = P (Sn hits a before it hits −b) and pb = P (Sn hits −b before
it hits a). Since lim supn↑∞ Sn = − lim inf n↑∞ Sn = ∞ (e.g., by LIL), it
follows that pa + pb = 1. Also, E [ST ] = apa − bpb = 0. Solving the two
equations, pa = b/(a + b), pb = a/(a + b).
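The values pa = b/(a + b), pb = a/(a + b) are easy to confirm by simulation (illustrative; a = 3, b = 2 chosen arbitrarily):

```python
import numpy as np

# Monte Carlo check of p_a = b/(a+b) for the symmetric +/-1 walk
# started at 0 with absorbing barriers at a = 3 and -b = -2.
rng = np.random.default_rng(5)
a, b, trials = 3, 2, 50_000
hits_a = 0
for _ in range(trials):
    s = 0
    while -b < s < a:
        s += 1 if rng.random() < 0.5 else -1
    hits_a += (s == a)
print(hits_a / trials)  # approx b/(a+b) = 0.4
```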
Ans. Since
(i) Σn I{Xn = i} and Σn P (Xn = i|Xm, m < n) converge or diverge
together a.s.,
(ii) Σn P (Xn = i|Xm, m < n) ≥ α Σn I{Xn−1 = j} for any j such that
p(i|j) ≥ α > 0, and
(iii) the graph is connected,
it follows that Xn = i i.o. for some i ∈ S implies Xn = i i.o. for all
i ∈ S. But Σ_{i∈S} Σn I{Xn = i} = Σn Σ_{i∈S} I{Xn = i} = Σn 1 = ∞, so
Xn = i i.o. for some (and hence all) i ∈ S.
{T = n} = {[X0 , · · · , Xn ] ∈ Cn0 }
{T = n} = {[XT ∧0 , XT ∧1 , · · ·] ∈ Cn } ∈ F T .
Also,
A = ∪n (A ∩ {T = n})
= ∪n ({[X0, · · · , Xn, Xn, Xn, · · ·] ∈ Bn} ∩ {T = n})
= ∪n ({[XT∧0, · · · , XT∧n, · · ·] ∈ Bn} ∩ {T = n}).
Thus A ∈ F T . Conversely, let A ∈ F T . Then it is of the form A =
{[XT ∧0 , XT ∧1 , · · ·] ∈ B} for some Borel B ⊂ R∞ . Then
A ∩ {T = n} = {[X0 , · · · , Xn , Xn , Xn , · · ·] ∈ B} ∩ {T = n} ∈ Fn ,
so A ∈ FT .
E [Xn+i I{Xn = 0}] = 0. Since Xn+i I{Xn = 0} ≥ 0 a.s., it follows from
Markov's inequality that Xn+i I{Xn = 0} = 0 a.s., which proves the
claim.
(⇐=) We have
0 = E [(Mn − Mm )g(M0 , · · · , Mm )]
= E [E[Mn − Mm |Fm ]g(M0 , · · · , Mm )]
for all g as above. Since we can approximate indicator functions of open
balls in a bounded fashion by continuous functions, we can extend this
to g = indicator of an open ball in Rm+1 . Then we have
∫_C E [Mn − Mm | Fm] dP = 0 = ∫_C 0 dP
for all open balls C ⊂ Rm+1 . Since the sets C for which the above
holds form a σ-field containing all open balls, it includes B(Rm+1 ).
By a.s. uniqueness of conditional expectation, it then follows that
E [Mn − Mm |Fm ] = 0 a.s.
{(Mn^∞, σ(Mm^∞, m ≤ n))}

is also a martingale.
The latter follows because (Λn , Fn ) is a regular martingale. It also fol-
lows that E[|Λn − Λ∞ |] → 0, from which we have E[Λ∞ ] = 1.
Exercise 4
Then

P (Yn+1 = j|Yn = i) = P (Yn = i, Yn+1 = j) / P (Yn = i)
= P (XN−n = i, XN−n−1 = j) / P (XN−n = i)
= P (XN−n−1 = j) p(XN−n = i|XN−n−1 = j) / πN−n(i)
= πN−n−1(j) p(i|j) / πN−n(i).
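The formula can be confirmed by simulating a small chain (an arbitrary 3-state example, not from the problem): estimate P (Yn+1 = j|Yn = i) from sample paths and compare it with πN−n−1(j)p(i|j)/πN−n(i).

```python
import numpy as np

# Monte Carlo check of the reversed-chain transition formula on a
# 3-state chain. Convention: p[j, i] = p(i|j), the probability of
# moving from state j to state i.
rng = np.random.default_rng(6)
p = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])
pi0 = np.array([0.2, 0.5, 0.3])
N, n, paths = 5, 2, 200_000
X = np.empty((paths, N + 1), dtype=int)
X[:, 0] = rng.choice(3, size=paths, p=pi0)
for t in range(N):
    u = rng.random(paths)
    # inverse-CDF sampling of the next state for every path at once
    X[:, t + 1] = (u[:, None] > p[X[:, t]].cumsum(axis=1)).sum(axis=1)
pis = [pi0]
for _ in range(N):
    pis.append(pis[-1] @ p)                 # marginal laws pi_k
i, j = 1, 0
empirical = (X[X[:, N - n] == i, N - n - 1] == j).mean()
formula = pis[N - n - 1][j] * p[j, i] / pis[N - n][i]
print(empirical, formula)  # the two agree up to sampling error
```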
3. Show that for a Markov chain {Xn },
P (Xn = i|Xn−1 , Xn−2 , Xn+1 , Xn+2 ) = P (Xn = i|Xn−1 , Xn+1 ).
Ans. On the set {Xn−1 = j, Xn−2 = k, Xn+1 = s, Xn+2 = r}, the left
hand side equals

P (Xn−2 = k, Xn−1 = j, Xn = i, Xn+1 = s, Xn+2 = r) / P (Xn−2 = k, Xn−1 = j, Xn+1 = s, Xn+2 = r)
= p(j|k)p(i|j)p(s|i)p(r|s) / Σ_{i′} p(j|k)p(i′|j)p(s|i′)p(r|s)
= p(i|j)p(s|i) / Σ_{i′} p(i′|j)p(s|i′)
= P (Xn = i|Xn−1 = j, Xn+1 = s).
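The cancellation above can be verified exactly on a small example (a 2-state chain with arbitrary transition probabilities), comparing the conditional law computed from the full joint distribution with p(i|j)p(s|i)/Σ_{i′} p(i′|j)p(s|i′):

```python
import numpy as np
from itertools import product

# Exact check of the two-sided Markov property on a 2-state chain:
# P(X2 = i | X0 = k, X1 = j, X3 = s, X4 = r) depends only on (j, s).
p = np.array([[0.7, 0.3],          # p[j, i] = p(i|j)
              [0.4, 0.6]])
pi0 = np.array([0.5, 0.5])
# joint law of (X0, ..., X4) via the chain rule
joint = np.einsum('a,ab,bc,cd,de->abcde', pi0, p, p, p, p)
for k, j, i, s, r in product(range(2), repeat=5):
    lhs = joint[k, j, i, s, r] / joint[k, j, :, s, r].sum()
    rhs = p[j, i] * p[i, s] / (p[j, :] * p[:, s]).sum()
    assert abs(lhs - rhs) < 1e-12
print("two-sided conditional law verified")
```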
6. Let {Xn} be an irreducible Markov chain on a finite state space S and
let A, B ⊂ S, satisfying A ∩ B = ∅ and A ∪ B ≠ S. Let τ := min{n ≥
0 : Xn ∈ A ∪ B}. Show that V (i) = Pi(Xτ ∈ A), i ∈ S, is characterized
by the linear system

V (i) = Σ_j p(j|i)V (j), i ∉ A ∪ B,

with V (i) = 1 for i ∈ A and V (i) = 0 for i ∈ B.
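As an illustration (the boundary values V = 1 on A, V = 0 on B, and a walk with a known answer, are choices made here): for simple random walk on {0, ..., 4} with A = {4}, B = {0}, solving the linear system recovers the classical V (i) = i/4.

```python
import numpy as np

# Solve V(i) = sum_j p(j|i) V(j) off A u B, with V = 1 on A = {4}
# and V = 0 on B = {0}, for simple random walk on {0, ..., 4}.
P = np.zeros((5, 5))
for i in range(1, 4):
    P[i, i - 1] = P[i, i + 1] = 0.5   # P[i, j] = p(j|i)
M = np.eye(5) - P                     # interior rows: (I - P)V = 0
M[0, :] = 0; M[0, 0] = 1.0            # boundary row: V(0) = 0
M[4, :] = 0; M[4, 4] = 1.0            # boundary row: V(4) = 1
rhs = np.zeros(5)
rhs[4] = 1.0
V = np.linalg.solve(M, rhs)
print(V)  # [0, 0.25, 0.5, 0.75, 1]
```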
7. Let {Xn } be a Markov chain with state space S and P the probability
measure induced on (S ∞ , F := B(S ∞ )) by [X0 , X1 , X2 , · · ·]. Let Fn :=
B(S n ) and Pn := the restriction of P to (S ∞ , Fn ) for n ≥ 0. Clearly,
F = ∨n Fn . Let (Λn , Fn ), n ≥ 0, be a non-negative martingale with
mean 1. Define a measure Q on (S∞, F) by: the restriction Qn of Q to
(S∞, Fn) is given by Qn(A) = ∫_A Λn dP, A ∈ Fn.
(a) Show that for any bounded random variable Y on (S ∞ , Fn , P ) and
m < n,
EQ[Y |Fm] = EP [Y Λn|Fm] / EP [Λn|Fm].
Ans. For A ∈ Fm , by the definition of Q,
∫_A (EP [Y Λn|Fm] / EP [Λn|Fm]) dQ = ∫_A (EP [Y Λn|Fm] / EP [Λn|Fm]) Λn dP
= ∫_A EP [ (EP [Y Λn|Fm] / EP [Λn|Fm]) Λn | Fm ] dP
= ∫_A (EP [Y Λn|Fm] / EP [Λn|Fm]) EP [Λn|Fm] dP
= ∫_A EP [Y Λn|Fm] dP = ∫_A Y Λn dP = ∫_A Y dQ.
The claim follows from the a.s. uniqueness of conditional expec-
tation.
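The identity can also be seen concretely on a finite space (a toy example: two fair coin tosses and an explicitly chosen positive mean-one martingale, none of it from the problem):

```python
from itertools import product

# Finite check of E_Q[Y|F_m] = E_P[Y L_n|F_m] / E_P[L_n|F_m] with
# n = 2, m = 1, on Omega = {0,1}^2 with two fair coin tosses.
omegas = list(product((0, 1), repeat=2))
P = {w: 0.25 for w in omegas}
lam2 = {(0, 0): 0.4, (0, 1): 1.6, (1, 0): 1.2, (1, 1): 0.8}
# lam2 averages to 1 on each F_1-atom, so Lambda is a mean-one martingale
Y = {w: w[0] + 2 * w[1] for w in omegas}    # an F_2 random variable
Q = {w: P[w] * lam2[w] for w in omegas}     # dQ = Lambda_2 dP on F_2
for a in (0, 1):                            # the two atoms of F_1
    atoms = [w for w in omegas if w[0] == a]
    eq = sum(Q[w] * Y[w] for w in atoms) / sum(Q[w] for w in atoms)
    num = sum(P[w] * Y[w] * lam2[w] for w in atoms)
    den = sum(P[w] * lam2[w] for w in atoms)
    assert abs(eq - num / den) < 1e-12
print("conditional Bayes formula verified on both atoms")
```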
(b) Using the above result, show that under Q, {Xn} is a Markov
chain if and only if Λn is of the form Π_{m=0}^{n−1} λm(Xm, Xm+1) for
some λm : S² → R+.
Ans. If Λn are of this form, then

EQ[f (Xn+1)|Fn] = EP [f (Xn+1) Π_{m=0}^{n} λm(Xm, Xm+1) | Fn] / EP [Π_{m=0}^{n} λm(Xm, Xm+1) | Fn]
V (i) = Ei[ Σ_{m=0}^{τ−1} (f (Xm) − βf) ], i ∈ S.
For i ≠ i0, we have

V (i) = Ei[ Σ_{m=0}^{τ−1} (f (Xm) − βf) ]
= f (i) − βf + Ei[ I{τ > 1} Σ_{m=1}^{τ−1} (f (Xm) − βf) ]
= f (i) − βf + Ei[ I{τ > 1} E[ Σ_{m=1}^{τ−1} (f (Xm) − βf) | X1 ] ]
= f (i) − βf + Ei[ I{τ > 1} V (X1) ]
= f (i) − βf + Ei[ V (X1) ]   (because V (i0) = 0)
= f (i) − βf + Σ_j p(j|i)V (j).
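The resulting equation can be sanity-checked by Monte Carlo on a small chain (an arbitrary 3-state example with i0 = 0, not from the problem): estimate V from the defining expectation and plug it into both sides.

```python
import numpy as np

# Estimate V(i) = E_i[sum_{m=0}^{tau-1} (f(X_m) - beta_f)] with
# tau = first hitting time of i0 = 0, then check the relation
# V(i) = f(i) - beta_f + sum_j p(j|i) V(j) for i != i0.
rng = np.random.default_rng(7)
P = np.array([[0.2, 0.5, 0.3],     # P[i, j] = p(j|i)
              [0.6, 0.2, 0.2],
              [0.3, 0.3, 0.4]])
f = np.array([1.0, 4.0, 2.0])
w, vl = np.linalg.eig(P.T)         # stationary law: left Perron vector
pi = np.real(vl[:, np.argmax(np.real(w))])
pi = pi / pi.sum()
beta = pi @ f                      # beta_f = sum_i pi(i) f(i)
def sample_V(i, runs=20_000):
    total = 0.0
    for _ in range(runs):
        s = i
        total += f[s] - beta
        while True:
            s = rng.choice(3, p=P[s])
            if s == 0:             # stop on hitting i0 = 0
                break
            total += f[s] - beta
    return total / runs
V = np.array([0.0, sample_V(1), sample_V(2)])   # V(i0) = 0
for i in (1, 2):
    print(V[i], f[i] - beta + P[i] @ V)         # sides agree approx.
```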
by π(i), sum over i, and use the fact that Σ_i π(i)p(j|i) = π(j)
to obtain Σ_i π(i)f (i) = β′, so β′ = βf. Thus subtracting the