Time Series Solution
CHAPTER 1
Exercise 1.1 Use software to produce the time series plot shown in Exhibit (1.2), page 2. The following R code will
produce the graph.
> library(TSA); data(larain); win.graph(width=3,height=3,pointsize=8)
> plot(y=larain,x=zlag(larain),ylab='Inches',xlab='Previous Year Inches')
Exercise 1.2 Produce the time series plot displayed in Exhibit (1.3), page 3. Use the R code
> data(color); plot(color,ylab='Color Property',xlab='Batch',type='o')
Exercise 1.3 Simulate a completely random process of length 48 with independent, normal values. Repeat this exercise several times with a new simulation, that is, a new seed, each time.
> plot(ts(rnorm(n=48)),type='o') # Each repetition of this command uses new random
# numbers. To reproduce the same simulation, first use the command set.seed(#########),
# where ######### is an integer of your choice.
Exercise 1.4 Simulate a completely random process of length 48 with independent, chi-square distributed values
each with 2 degrees of freedom. Use the same R code as in the solution of Exercise 1.3 but replace rnorm(n=48)
with rchisq(n=48,df=2).
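In full, the command is
> plot(ts(rchisq(n=48,df=2)),type='o')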
Exercise 1.5 Simulate a completely random process of length 48 with independent, t-distributed values each with 5
degrees of freedom. Construct the time series plot. Use the same R code as in the solution of Exercise 1.3 but
replace rnorm(n=48) with rt(n=48,df=5).
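Similarly, the full command here is
> plot(ts(rt(n=48,df=5)),type='o')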
Exercise 1.6 Construct a time series plot with monthly plotting symbols for the Dubuque temperature series as in Exhibit (1.7), page 6. (Make the plot full screen so that you can see all of the detail.)
> data(tempdub); plot(tempdub,ylab='Temperature')
> points(y=tempdub,x=time(tempdub), pch=as.vector(season(tempdub)))
CHAPTER 2
Exercise 2.1 Suppose E(X) = 2, Var(X) = 9, E(Y) = 0, Var(Y) = 4, and Corr(X,Y) = 0.25. Find:
(a) Var(X + Y) = Var(X) + Var(Y) + 2Cov(X,Y) = 9 + 4 + 2(0.25 × 3 × 2) = 9 + 4 + 3 = 16, since Cov(X,Y) = Corr(X,Y)√Var(X)√Var(Y) = 0.25 × 3 × 2 = 1.5.
(b) Cov(X, X + Y) = Cov(X,X) + Cov(X,Y) = 9 + 1.5 = 10.5.
(c) Corr(X + Y, X − Y). As in part (a), Var(X − Y) = 9 + 4 − 2(1.5) = 10. Then Cov(X + Y, X − Y) = Cov(X,X) − Cov(X,Y) + Cov(Y,X) − Cov(Y,Y) = Var(X) − Var(Y) = 9 − 4 = 5. So

$$ \mathrm{Corr}(X+Y, X-Y) = \frac{\mathrm{Cov}(X+Y, X-Y)}{\sqrt{\mathrm{Var}(X+Y)\,\mathrm{Var}(X-Y)}} = \frac{5}{\sqrt{16 \times 10}} = \frac{5}{4\sqrt{10}} = 0.39528471 $$
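As a quick numerical check (a sketch only; the use of the MASS package, the seed, and the sample size are arbitrary choices, not part of the exercise), simulate a bivariate normal pair with the stated moments and compare with the answers above:
> library(MASS) # for mvrnorm
> set.seed(1234)
> Sigma <- matrix(c(9,1.5,1.5,4),2,2) # Cov(X,Y) = 0.25*3*2 = 1.5
> xy <- mvrnorm(n=100000, mu=c(2,0), Sigma=Sigma)
> x <- xy[,1]; y <- xy[,2]
> var(x + y) # should be near 16
> cor(x + y, x - y) # should be near 5/(4*sqrt(10)) = 0.3953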
Exercise 2.2 If X and Y are dependent but Var(X) = Var(Y), find Cov(X + Y, X − Y).
Cov(X + Y, X − Y) = Cov(X,X) − Cov(Y,Y) + Cov(Y,X) − Cov(X,Y) = Var(X) − Var(Y) = 0.
Exercise 2.3 Let X have a distribution with mean μ and variance σ², and let Yt = X for all t.
(a) Show that {Yt} is strictly and weakly stationary. Let t1, t2,…, tn be any set of time points and k any time lag.
Then
Pr ( Y t < y t , Y t ≤ y t , …, Y t ≤ y t ) = Pr ( X < y t , X ≤ y t , …, X ≤ y t )
1 1 2 2 n n 1 2 n
= Pr ( Y t – k < y t , Y t – k ≤ y t , …, Y t – k ≤ y t )
1 1 2 2 n n
as required for strict stationarity. Since the autocovariance clearly exists (see part (b)), the process is also weakly stationary.
(b) Find the autocovariance function for {Yt}. Cov(Yt,Yt − k) = Cov(X,X) = σ² for all t and k, free of t (and k).
(c) Sketch a “typical” time plot of Yt. The plot will be a horizontal “line” (really a discrete-time horizontal line)
at the height of the observed X.
Exercise 2.4 Let {et} be a zero-mean white noise process. Suppose that the observed process is Yt = et + θet − 1, where θ is either 3 or 1/3.
(a) Find the autocorrelation function for {Yt} both when θ = 3 and when θ = 1/3. E(Yt) = E(et + θet − 1) = 0. Also Var(Yt) = Var(et + θet − 1) = σ² + θ²σ² = σ²(1 + θ²), and Cov(Yt,Yt − 1) = Cov(et + θet − 1, et − 1 + θet − 2) = θσ², free of t. Now for k > 1, Cov(Yt,Yt − k) = Cov(et + θet − 1, et − k + θet − k − 1) = 0 since all of these error terms are uncorrelated. So
$$ \mathrm{Corr}(Y_t, Y_{t-k}) = \frac{\mathrm{Cov}(Y_t, Y_{t-k})}{\sqrt{\mathrm{Var}(Y_t)\,\mathrm{Var}(Y_{t-k})}} = \begin{cases} 1 & \text{for } k = 0 \\[4pt] \dfrac{\theta\sigma^2}{\sigma^2(1+\theta^2)} = \dfrac{\theta}{1+\theta^2} & \text{for } k = 1 \\[4pt] 0 & \text{for } k > 1 \end{cases} $$
But 3/(1 + 3²) = 3/10 and (1/3)/[1 + (1/3)²] = (1/3)/(10/9) = 3/10. So the autocorrelation functions are identical.
(b) You should have discovered that the time series is stationary regardless of the value of θ and that the autocorrelation functions are the same for θ = 3 and θ = 1/3. For simplicity, suppose that the process mean is known to be zero and the variance of Yt is known to be 1. You observe the series {Yt} for t = 1, 2,..., n and suppose that you can produce good estimates of the autocorrelations ρk. Do you think that you could determine which value of θ is correct (3 or 1/3) based on the estimate of ρk? Why or why not? No: by part (a), the two values of θ produce exactly the same autocorrelation function, so no estimates of the ρk, however good, can distinguish between them.
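A small simulation illustrates the point (a sketch; the seed and series lengths are arbitrary choices). R's arima.sim uses the same MA(1) convention, Yt = et + θet − 1, so:
> set.seed(123)
> y3 <- arima.sim(model=list(ma=3), n=10000) # theta = 3
> y13 <- arima.sim(model=list(ma=1/3), n=10000) # theta = 1/3
> acf(y3, plot=FALSE)$acf[2] # lag-1 autocorrelation, near 3/10
> acf(y13, plot=FALSE)$acf[2] # also near 3/10
Both sample lag-one autocorrelations estimate the same value, 0.3, so the estimates cannot distinguish θ = 3 from θ = 1/3.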
Exercise 2.5 Suppose Yt = 5 + 2t + Xt where {Xt} is a zero mean stationary series with autocovariance function γk.
(a) Find the mean function for {Yt}. E(Yt) = E(5 + 2t + Xt) = 5 + 2t + E(Xt) = 5 + 2t.
(b) Find the autocovariance function for {Yt}.
Cov(Yt,Yt − k) = Cov(5 + 2t + Xt, 5 + 2(t − k) + Xt − k) = Cov(Xt,Xt − k) = γk free of t.
(c) Is {Yt} stationary? (Why or why not?) In spite of part (b), the process {Yt} is not stationary since its mean varies with time.
Exercise 2.6 Let {Xt} be a stationary time series and define
$$ Y_t = \begin{cases} X_t & \text{for } t \text{ odd} \\ X_t + 3 & \text{for } t \text{ even} \end{cases} $$
(a) Show that Cov(Yt,Yt − k) is free of t for all lags k. Adding constants does not change covariances, so in every case Cov(Yt,Yt − k) = Cov(Xt,Xt − k) = γk, which is free of t since {Xt} is stationary.
(b) Is {Yt} stationary? {Yt} is not stationary since E(Yt) = E(Xt) = μX for t odd but E(Yt) = E(Xt + 3) = μX + 3 for
t even.
Exercise 2.7 Suppose that {Yt} is stationary with autocovariance function γk.
(a) Show that Wt = ∇Yt = Yt − Yt−1 is stationary by finding the mean and autocovariance function for {Wt}.
E(Wt) = E(Yt − Yt − 1) = E(Yt) − E(Yt − 1) = 0 since {Yt} is stationary. Also
Cov(Wt,Wt − k) = Cov(Yt − Yt − 1, Yt − k − Yt − k − 1) = Cov(Yt,Yt − k) − Cov(Yt,Yt − k − 1) − Cov(Yt − 1,Yt − k) + Cov(Yt − 1,Yt − k − 1) = γk − γk + 1 − γk − 1 + γk = 2γk − γk + 1 − γk − 1, free of t.
(b) Show that Ut = ∇²Yt = ∇[Yt − Yt − 1] = Yt − 2Yt − 1 + Yt − 2 is stationary. (You need not find the mean and autocovariance function for {Ut}.) Ut is the first difference of the process {∇Yt}. By part (a), {∇Yt} is stationary, so its first difference is, again by part (a), itself stationary.
Exercise 2.8 Suppose that {Yt} is stationary with autocovariance function γk. Show that for any fixed positive integer n and any constants c1, c2,..., cn, the process {Wt} defined by Wt = c1Yt + c2Yt − 1 + … + cnYt − n + 1 is stationary. First

$$ E(W_t) = c_1 E(Y_t) + c_2 E(Y_{t-1}) + \cdots + c_n E(Y_{t-n+1}) = (c_1 + c_2 + \cdots + c_n)\mu_Y, \quad \text{free of } t. $$

Also

$$ \mathrm{Cov}(W_t, W_{t-k}) = \mathrm{Cov}\Big( \sum_{j=1}^{n} c_j Y_{t-j+1}, \sum_{i=1}^{n} c_i Y_{t-k-i+1} \Big) = \sum_{j=1}^{n} \sum_{i=1}^{n} c_j c_i \, \mathrm{Cov}(Y_{t-j+1}, Y_{t-k-i+1}) = \sum_{j=1}^{n} \sum_{i=1}^{n} c_j c_i \, \gamma_{k+i-j}, \quad \text{free of } t. $$
Exercise 2.9 Suppose Yt = β0 + β1t + Xt where {Xt} is a zero mean stationary series with autocovariance function γk
and β0 and β1 are constants.
(a) Show that {Yt} is not stationary but that Wt = ∇Yt = Yt − Yt − 1 is stationary. {Yt} is not stationary since its
mean, β0 + β1t, varies with t. However, E(Wt) = E(Yt − Yt − 1) = (β0 + β1t) − (β0 + β1(t − 1)) = β1, free of t.
The argument in the solution of Exercise 2.7 shows that the covariance function for {Wt} is free of t.
(b) In general, show that if Yt = μt + Xt, where {Xt} is a zero mean stationary series and μt is a polynomial in t of degree d, then ∇mYt = ∇(∇m − 1Yt) is stationary for m ≥ d and nonstationary for 0 ≤ m < d. Use part (a) and proceed by induction: each application of ∇ reduces the degree of the polynomial trend by one, so d differences leave a constant mean, while fewer than d differences leave a mean that still varies with t. A numerical illustration of part (a) appears below.
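The following sketch illustrates part (a) with simulated data (the AR(1) noise, β0 = 5, β1 = 2, the seed, and the series length are all arbitrary choices):
> set.seed(42)
> tt <- 1:200
> x <- arima.sim(model=list(ar=0.5), n=200) # a zero-mean stationary series
> y <- 5 + 2*tt + x # Yt = beta0 + beta1*t + Xt
> plot(ts(y)); plot(diff(y)) # trend in Yt; no trend in Wt
> mean(diff(y)) # near beta1 = 2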
Exercise 2.10 Let {Xt} be a zero-mean, unit-variance stationary process with autocorrelation function ρk. Suppose
that μt is a nonconstant function and that σt is a positive-valued nonconstant function. The observed series is formed
as Yt = μt + σtXt.
(a) Find the mean and covariance function for the {Yt} process.
Notice that Cov(Xt,Xt − k) = Corr(Xt,Xt − k) = ρk since {Xt} has unit variance. E(Yt) = E(μt + σtXt) = μt + σtE(Xt) = μt. Now Cov(Yt,Yt − k) = Cov(μt + σtXt, μt − k + σt − kXt − k) = σtσt − kCov(Xt,Xt − k) = σtσt − kρk. Notice that Var(Yt) = σt².
(b) Show that the autocorrelation function for the {Yt} process depends only on the time lag. Is the {Yt} process stationary? Corr(Yt,Yt − k) = σtσt − kρk/[σtσt − k] = ρk, but {Yt} is not necessarily stationary since E(Yt) = μt varies with t.
(c) Is it possible to have a time series with a constant mean and with Corr(Yt,Yt − k) free of t but with {Yt} not stationary? Yes. If μt is constant but σt varies with t, the mean is constant and the autocorrelation is free of t, yet Var(Yt) = σt² still depends on t, so {Yt} is not stationary.
Exercise 2.11 Suppose Cov(Xt ,Xt − k) = γk is free of t but that E(Xt) = 3t.
(a) Is {Xt} stationary? No since E(Xt) varies with t.
(b) Let Yt = 7 − 3t + Xt. Is {Yt} stationary? Yes, since the covariances are unchanged but now E(Yt) = 7 − 3t + E(Xt) = 7 − 3t + 3t = 7, free of t.
Exercise 2.12 Suppose that Yt = et − et − 12. Show that {Yt} is stationary and that, for k > 0, its autocorrelation function is nonzero only for lag k = 12.
E(Yt) = E(et − et − 12) = 0. Also, Cov(Yt,Yt − k) = Cov(et − et − 12, et − k − et − 12 − k) = −Cov(et − 12,et − k) = −σe² when k = 12; it is zero for any other k > 0 since then all of the error terms involved are uncorrelated. Since Var(Yt) = 2σe², the only nonzero autocorrelation at a positive lag is ρ12 = −1/2.
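A quick check by simulation (the seed and length are arbitrary choices): diff(e, lag=12) computes et − et − 12 directly.
> set.seed(7)
> y <- diff(rnorm(5000), lag=12)
> acf(y, lag.max=24) # a single spike of about -0.5 at lag 12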
Exercise 2.13 Let Yt = et − θ(et − 1)². For this exercise, assume that the white noise series is normally distributed.
(a) Find the autocorrelation function for {Yt}. First recall that for a zero-mean normal distribution E[(et − 1)³] = 0 and E[(et − 1)⁴] = 3σe⁴. Then E(Yt) = −θE[(et − 1)²] = −θσe², which is constant in t, and

$$ \mathrm{Var}(Y_t) = \mathrm{Var}(e_t) + \theta^2\,\mathrm{Var}(e_{t-1}^2) = \sigma_e^2 + \theta^2 \{ E(e_{t-1}^4) - [E(e_{t-1}^2)]^2 \} = \sigma_e^2 + \theta^2 \{ 3\sigma_e^4 - \sigma_e^4 \} = \sigma_e^2 + 2\theta^2 \sigma_e^4 $$

Furthermore, Cov(Yt,Yt − 1) = Cov(et − θ(et − 1)², et − 1 − θ(et − 2)²) = −θCov((et − 1)², et − 1) = −θE[(et − 1)³] = 0, and for k > 1 all of the terms involved are independent. So ρ0 = 1 and ρk = 0 for k > 0; that is, {Yt} is stationary and behaves as a (non-normal) white noise process.
Exercise 2.14 Evaluate the mean and covariance function for each of the following processes. In each case determine
whether or not the process is stationary.
(a) Yt = θ0 + tet. The mean is E(Yt) = θ0, free of t, but the process is not stationary since Var(Yt) = t²Var(et) = t²σe² is not free of t.
(b) Wt = ∇Yt where Yt is as given in part (a). Wt = ∇Yt = (θ0 + tet) − (θ0 + (t − 1)et − 1) = tet − (t − 1)et − 1.
So the mean of Wt is zero. However, Var(Wt) = [t² + (t − 1)²]σe², which depends on t, so {Wt} is not stationary.
(c) Yt = etet − 1. (You may assume that {et} is normal white noise.) The mean of Yt is clearly zero. Lag one is the only lag at which there might be correlation. However, Cov(Yt,Yt − 1) = E(etet − 1 · et − 1et − 2) = E(et)E[(et − 1)²]E(et − 2) = 0 by independence. So the process Yt = etet − 1 is stationary and is a non-normal white noise!
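Part (c) is easy to see in a simulation (a sketch; the seed and length are arbitrary choices):
> set.seed(99)
> e <- rnorm(5001)
> y <- e[-1] * e[-length(e)] # Yt = et * et-1
> acf(y) # no significant autocorrelation at any lag
> qqnorm(y) # clearly non-normal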
Exercise 2.15 Suppose that X is a random variable with zero mean. Define a time series by Yt = (−1)^t X.
(a) Find the mean function for {Yt}. E(Yt) = (−1)^t E(X) = 0.
(b) Find the covariance function for {Yt}. Cov(Yt,Yt − k) = Cov[(−1)^t X, (−1)^(t − k) X] = (−1)^(2t − k)Cov(X,X) = (−1)^k σX², since (−1)^(2t − k) = (−1)^(−k) = (−1)^k.
(c) Is {Yt} stationary? Yes, the mean is constant and the covariance only depends on lag.
Exercise 2.16 Suppose Yt = A + Xt where {Xt} is stationary and A is random but independent of {Xt}. Find the mean and covariance function for {Yt} in terms of the mean and autocovariance function for {Xt} and the mean and variance of A. First E(Yt) = E(A) + E(Xt) = μA + μX, free of t. Also, since {Xt} and A are independent,

$$ \mathrm{Cov}(Y_t, Y_{t-k}) = \mathrm{Cov}(A + X_t, A + X_{t-k}) = \mathrm{Cov}(A,A) + \mathrm{Cov}(X_t, X_{t-k}) = \mathrm{Var}(A) + \gamma_k^X, \quad \text{free of } t. $$
Exercise 2.17 Let {Yt} be stationary with autocovariance function γk. Let $\bar{Y} = \frac{1}{n}\sum_{t=1}^{n} Y_t$. Show that

$$ \mathrm{Var}(\bar{Y}) = \frac{\gamma_0}{n} + \frac{2}{n} \sum_{k=1}^{n-1} \Big(1 - \frac{k}{n}\Big) \gamma_k = \frac{1}{n} \sum_{k=-n+1}^{n-1} \Big(1 - \frac{|k|}{n}\Big) \gamma_k $$

First,

$$ \mathrm{Var}(\bar{Y}) = \frac{1}{n^2} \mathrm{Var}\Big[\sum_{t=1}^{n} Y_t\Big] = \frac{1}{n^2} \mathrm{Cov}\Big[\sum_{t=1}^{n} Y_t, \sum_{s=1}^{n} Y_s\Big] = \frac{1}{n^2} \sum_{t=1}^{n} \sum_{s=1}^{n} \gamma_{t-s} $$
Now make the change of variable t − s = k and t = j in the double sum. The range of the summation
{1 ≤ t ≤ n, 1 ≤ s ≤ n} is transformed into {1 ≤ j ≤ n, 1 ≤ j − k ≤ n} = {k + 1 ≤ j ≤ n + k, 1 ≤ j ≤ n} which may be
written { k > 0, k + 1 ≤ j ≤ n } ∪ { k ≤ 0, 1 ≤ j ≤ n + k } . Thus
$$ \mathrm{Var}(\bar{Y}) = \frac{1}{n^2} \Big[ \sum_{k=1}^{n-1} \sum_{j=k+1}^{n} \gamma_k + \sum_{k=-n+1}^{0} \sum_{j=1}^{n+k} \gamma_k \Big] = \frac{1}{n^2} \Big[ \sum_{k=1}^{n-1} (n-k)\gamma_k + \sum_{k=-n+1}^{0} (n+k)\gamma_k \Big] = \frac{1}{n} \sum_{k=-n+1}^{n-1} \Big(1 - \frac{|k|}{n}\Big) \gamma_k $$
Use γk = γ−k to get the first expression in the exercise.
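The formula can be verified by Monte Carlo for any process with known γk. A sketch for an MA(1) series with θ = 0.8 and σe² = 1, so that γ0 = 1 + θ², γ1 = θ, and γk = 0 for k > 1 (the seed, n, and the replication count are arbitrary choices):
> set.seed(2023)
> theta <- 0.8; n <- 50
> ybars <- replicate(20000, mean(arima.sim(model=list(ma=theta), n=n)))
> var(ybars) # empirical Var(Ybar)
> (1 + theta^2)/n + (2/n)*(1 - 1/n)*theta # the formula, since only gamma_1 is nonzero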
Exercise 2.18 Let {Yt} be stationary with autocovariance function γk. Define the sample variance as

$$ S^2 = \frac{1}{n-1} \sum_{t=1}^{n} (Y_t - \bar{Y})^2 $$
(a) First show that

$$ \sum_{t=1}^{n} (Y_t - \mu)^2 = \sum_{t=1}^{n} (Y_t - \bar{Y})^2 + n(\bar{Y} - \mu)^2 $$

$$ \sum_{t=1}^{n} (Y_t - \mu)^2 = \sum_{t=1}^{n} (Y_t - \bar{Y} + \bar{Y} - \mu)^2 = \sum_{t=1}^{n} (Y_t - \bar{Y})^2 + n(\bar{Y} - \mu)^2 + 2(\bar{Y} - \mu) \sum_{t=1}^{n} (Y_t - \bar{Y}) $$

and the last sum is zero since $\sum_{t=1}^{n} (Y_t - \bar{Y}) = 0$.
(b) Use part (a) to show that

$$ E(S^2) = \frac{n}{n-1}\gamma_0 - \frac{n}{n-1}\mathrm{Var}(\bar{Y}) = \gamma_0 - \frac{2}{n-1} \sum_{k=1}^{n-1} \Big(1 - \frac{k}{n}\Big) \gamma_k $$

(Use the results of Exercise (2.17) for the last expression.)
$$ E(S^2) = \frac{1}{n-1} E\Big[\sum_{t=1}^{n} (Y_t - \bar{Y})^2\Big] = \frac{1}{n-1} E\Big[\sum_{t=1}^{n} (Y_t - \mu)^2 - n(\bar{Y} - \mu)^2\Big] = \frac{1}{n-1} \Big[ n\gamma_0 - n\mathrm{Var}(\bar{Y}) \Big] $$

$$ = \frac{1}{n-1} \Big[ n\gamma_0 - n\Big\{ \frac{\gamma_0}{n} + \frac{2}{n} \sum_{k=1}^{n-1} \Big(1 - \frac{k}{n}\Big) \gamma_k \Big\} \Big] = \gamma_0 - \frac{2}{n-1} \sum_{k=1}^{n-1} \Big(1 - \frac{k}{n}\Big) \gamma_k $$
(c) If {Yt} is a white noise process with variance γ0, show that E(S²) = γ0. This follows since for white noise γk = 0 for k > 0.
Exercise 2.19 Let Y1 = θ0 + e1 and then for t > 1 define Yt recursively by Yt = θ0 + Yt − 1 + et. Here θ0 is a constant.
The process {Yt} is called a random walk with drift.
(a) Show that Yt may be rewritten as Yt = tθ0 + et + et − 1 + … + e1. Substitute Yt − 1 = θ0 + Yt − 2 + et − 1 into Yt = θ0 + Yt − 1 + et and repeat until you get back to Y1 = θ0 + e1.
(b) Find the mean function for Yt. E(Yt) = E(tθ0 + et + et − 1 + … + e1) = tθ0.
(c) Find the autocovariance function for Yt. For t ≥ k,

$$ \mathrm{Cov}(Y_t, Y_{t-k}) = \mathrm{Cov}[t\theta_0 + e_t + e_{t-1} + \cdots + e_1,\ (t-k)\theta_0 + e_{t-k} + e_{t-k-1} + \cdots + e_1] $$
$$ = \mathrm{Cov}[e_{t-k} + e_{t-k-1} + \cdots + e_1,\ e_{t-k} + e_{t-k-1} + \cdots + e_1] = \mathrm{Var}(e_{t-k} + e_{t-k-1} + \cdots + e_1) = (t-k)\sigma_e^2 $$
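A simulated path makes the drift visible (a sketch; θ0 = 0.5, the seed, and the length are arbitrary choices). Note that cumsum(θ0 + et) reproduces exactly the representation in part (a):
> set.seed(5)
> theta0 <- 0.5; n <- 100
> y <- cumsum(theta0 + rnorm(n)) # Yt = t*theta0 + e1 + ... + et
> plot(ts(y), type='o') # wanders about the trend line t*theta0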
Exercise 2.20 Consider the standard random walk model where Yt = Yt − 1 + et with Y1 = e1.
(a) Use the above representation of Yt to show that μt = μt−1 for t > 1 with initial condition μ1 = E(e1) = 0. Hence
show that μt = 0 for all t. Clearly, μ1 = E(Y1) = E(e1) = 0. Then E(Yt) = E(Yt − 1 + et) = E(Yt − 1) + E(et) =
E(Yt − 1) or μt = μt − 1 for t > 1 and the result follows by induction.
(b) Similarly, show that Var(Yt) = Var(Yt − 1) + σe² for t > 1, with Var(Y1) = σe², and, hence, Var(Yt) = tσe².
Var(Y1) = σe² is immediate. Then Var(Yt) = Var(Yt − 1 + et) = Var(Yt − 1) + Var(et) = Var(Yt − 1) + σe². Recursion or induction on t yields Var(Yt) = tσe².
(c) For 0 ≤ t ≤ s, use Ys = Yt + et + 1 + et + 2 + … + es to show that Cov(Yt,Ys) = Var(Yt) and, hence, that Cov(Yt,Ys) = min(t, s)σe². For 0 ≤ t ≤ s,
Cov(Yt,Ys) = Cov(Yt, Yt + et + 1 + et + 2 + … + es) = Cov(Yt,Yt) = Var(Yt) = tσe², and hence the result.
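Part (c) can be checked by simulating many independent walks (the seed, walk length, and replication count are arbitrary choices):
> set.seed(11)
> walks <- replicate(20000, cumsum(rnorm(10))) # each column is one walk of length 10
> cov(walks[3,], walks[7,]) # near min(3,7) = 3
> var(walks[7,]) # near 7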
Exercise 2.21 A random walk with random starting value. Let Yt = Y0 + et + et − 1 + … + e1 for t > 0, where Y0 has a distribution with mean μ0 and variance σ0². Suppose further that Y0, e1,..., et are independent.
(a) Show that E(Yt) = μ0 for all t.
E(Yt) = E(Y0 + et + et − 1 + … + e1) = E(Y0) + E(et) + E(et − 1) + … + E(e1) = E(Y0) = μ0.
(b) Show that Var(Yt) = tσe² + σ0².
Var(Yt) = Var(Y0 + et + et − 1 + … + e1) = Var(Y0) + Var(et) + Var(et − 1) + … + Var(e1) = σ0² + tσe².
(c) Show that Cov(Yt,Ys) = min(t, s)σe² + σ0². Let t be less than s. Then, as in the previous exercise,
Cov(Yt,Ys) = Cov(Yt, Yt + et + 1 + et + 2 + … + es) = Var(Yt) = σ0² + tσe².
(d) Show that

$$ \mathrm{Corr}(Y_t, Y_s) = \sqrt{\frac{t\sigma_e^2 + \sigma_0^2}{s\sigma_e^2 + \sigma_0^2}} \quad \text{for } 0 \le t \le s $$

Just use the results of parts (b) and (c).
Exercise 2.22 Let {et} be a zero-mean white noise process and let c be a constant with |c| < 1. Define Yt recursively
by Yt = cYt−1 + et with Y1 = e1.
This exercise can be solved using the recursive definition of Yt or by expressing Yt explicitly using repeated substitution as

$$ Y_t = c(cY_{t-2} + e_{t-1}) + e_t = \cdots = e_t + ce_{t-1} + c^2 e_{t-2} + \cdots + c^{t-1} e_1 $$

Parts (c), (d), and (e) essentially assume you are working with the recursive version of Yt, but they can also be solved using this explicit representation.
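For experimentation, stats::filter with method='recursive' generates exactly this recursion, y[t] = e[t] + c*y[t − 1] with zero initialization, so that Y1 = e1 (the value of c, the seed, and the length below are arbitrary choices):
> set.seed(8)
> c0 <- 0.6
> e <- rnorm(500)
> y <- stats::filter(e, filter=c0, method='recursive') # Yt = c*Yt-1 + et
> acf(y) # autocorrelations decay roughly like c0^k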