Time Series Analysis
Ch2. Fundamental Concepts


2.1 Time Series and Stochastic Processes
An observed time series is often modeled by a stochastic process:
Definition 1
A sequence of random variables {Yt : t = 0, ±1, ±2, ±3, · · · } is
called a stochastic process.

The complete probabilistic structure of a stochastic process
{Yt : t = 0, ±1, ±2, ±3, · · · } is determined by the set of joint
distributions of all finite collections of the Y's. However, much of
the information about the process can be obtained from its means,
variances, and covariances alone.



2.2. Means, Variances, and Covariances
In general, let X, Y, Z be r.v.s.
Theorem 2
Basic properties of the mean µX = E(X):
1. If h(x) is a function and X has p.d.f. f, then
E(h(X)) = ∫_{−∞}^{+∞} h(x) f(x) dx.
2. For constants a, b, c,
E(aX + bY + c) = aE(X) + bE(Y) + c.
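
The linearity property is easy to verify numerically. A minimal R
sketch by Monte Carlo simulation (the distributions, constants, and
sample size below are arbitrary choices):

# Verify E(aX + bY + c) = a E(X) + b E(Y) + c by simulation
set.seed(7)
X <- rexp(100000, rate = 2); Y <- runif(100000)
a <- 3; b <- -1; c <- 10
mean(a * X + b * Y + c)          # sample mean of the transformed variable
a * mean(X) + b * mean(Y) + c    # matches up to simulation noise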



Cov(X, Y) is in fact an inner product of the projections of the random
variables X and Y onto the zero-mean subspace of the space of random
variables, so it satisfies the properties of an inner product.
Theorem 3
Basic properties of the variance and covariance:
Variance:
1. σX² = Var(X) = Cov(X, X) = E((X − µX)²) ≥ 0.
2. Var(a + bX) = b² Var(X).
3. Var(X + Y) = Var(X) + 2Cov(X, Y) + Var(Y).
Covariance:
1. Cov(X, Y) = Cov(Y, X).
2. Cov(a + bX, c + dY) = Cov(bX, dY) = bd Cov(X, Y).
3. Cov(aX + bY, Z) = a Cov(X, Z) + b Cov(Y, Z).
4. |Cov(X, Y)| ≤ √(Var(X) Var(Y)).
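
These properties can also be checked numerically. A minimal R sketch
(the constants and the correlated pair below are arbitrary choices):

# Verify Cov(a + bX, c + dY) = bd * Cov(X, Y) by simulation
set.seed(1)
X <- rnorm(100000); Y <- 0.6 * X + rnorm(100000)   # a correlated pair
a <- 2; b <- -3; c <- 1; d <- 4
cov(a + b * X, c + d * Y)    # sample covariance of the transformed pair
b * d * cov(X, Y)            # the two agree up to simulation noise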



Theorem 4
Basic properties of the correlation
ρ = Corr(X, Y) = Cov(X, Y)/√(Var(X) Var(Y)):
1. −1 ≤ Corr(X, Y) = Corr(Y, X) ≤ 1.
2. Corr(a + bX, c + dY) = Corr(bX, dY) = sign(bd) Corr(X, Y).
3. Corr(X, Y) = ±1 iff there are constants a and b such that
Prob(Y = a + bX) = 1.

Theorem 5
If the r.v.s X and Y are independent, then each of the following holds:
Cov(X, Y) = 0,
Corr(X, Y) = 0,
Var(X + Y) = Var(X) + Var(Y).
The converse is false in general: uncorrelated r.v.s need not be
independent (although for jointly normal X and Y, zero correlation
does imply independence).
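
A standard counterexample, checked numerically in R (a sketch): take
X ~ N(0, 1) and Y = X². Then Cov(X, Y) = E(X³) = 0, yet Y is a
function of X, so the two are clearly dependent.

# Uncorrelated but not independent: X ~ N(0,1), Y = X^2
set.seed(2)
X <- rnorm(100000); Y <- X^2
cor(X, Y)         # approximately 0: uncorrelated
cor(abs(X), Y)    # clearly nonzero, so X and Y are not independent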



Means, autocovariances, and autocorrelations:
Consider a stochastic process {Yt : t = 0, ±1, ±2, ±3, · · · }. Suppose the
r.v. Yt has probability density function (p.d.f.) ft(x).

Definition 6
The mean function is defined by

µt = E(Yt) = ∫_{−∞}^{+∞} x ft(x) dx, t = 0, ±1, ±2, · · ·

The autocovariance function, γt,s, is defined as

γt,s = Cov(Yt, Ys) = E((Yt − µt)(Ys − µs)), t, s = 0, ±1, ±2, · · ·

In particular, γt,t = Cov(Yt, Yt) is called the variance of Yt.


The autocorrelation function, ρt,s, is given by

ρt,s = Corr(Yt, Ys) = Cov(Yt, Ys)/√(Var(Yt) Var(Ys)) = γt,s/√(γt,t γs,s).



We immediately see that:
γt,t = Var(Yt) ≥ 0,
ρt,t = 1,
γt,s = γs,t,
−1 ≤ ρt,s = ρs,t ≤ 1,
|γt,s| ≤ √(γt,t γs,s).
If ρt,s = 0, we say that Yt and Ys are uncorrelated.



Example. Random Walk:
Let e1 , e2 , · · · be a sequence of independent, identically distributed
random variables (i.i.d. r.v.s, i.e. white noise), each with zero
mean and variance σe². The random walk process is defined by:

Y1 = e1,
Y2 = e1 + e2,
⋮
Yt = e1 + e2 + · · · + et.

Alternatively, we can write

Yt = Yt−1 + et , Y1 = e1 .

This is an autoregressive process (to be discussed later). If the e's
are interpreted as the sizes of the steps taken (forward or backward)
along a number line, then Yt is the position of the random walker at
time t.
Direct calculation shows that:

µt = 0,
γt,t = Var(Yt) = tσe²,
γt,s = Cov(Yt, Ys) = tσe² for t ≤ s,
ρt,s = Corr(Yt, Ys) = √(t/s) for t ≤ s.

(For the covariance, write Ys = Yt + et+1 + · · · + es for t ≤ s; since
Yt is independent of the later steps, Cov(Yt, Ys) = Var(Yt) = tσe².)
Exhibit 2.1 shows a simulated random walk. Another simulation is
also given in chap2.R. Alternatively, we may define a function.
# Define the function rw(N): simulate and plot a random walk with N steps
rw <- function(N) {
  Y <- ts(cumsum(rnorm(N)), frequency = 1, start = 1)
  plot(Y, ylab = "Random Walk", xlab = "Time", type = "o")
}
set.seed(13579)
rw(100)  # Simulate a random walk of 100 steps
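
We can also verify the moment formulas by simulating many independent
walks (a sketch; the replication count and time points are arbitrary):

# Monte Carlo check of Var(Yt) = t*sigma_e^2 and Corr(Yt, Ys) = sqrt(t/s)
set.seed(246)
paths <- replicate(5000, cumsum(rnorm(50)))  # each column is one walk
var(paths[10, ])                # theory: 10
var(paths[50, ])                # theory: 50
cor(paths[10, ], paths[50, ])   # theory: sqrt(10/50), about 0.447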



Example. A Moving Average: Suppose {Yt} is constructed as

Yt = (et + et−1)/2.

We calculate that

µt = E(Yt) = 0,
Var(Yt) = Var((et + et−1)/2) = σe²/2,
Cov(Yt, Yt−1) = Cov((et + et−1)/2, (et−1 + et−2)/2) = σe²/4,
Cov(Yt, Yt−k) = Cov((et + et−1)/2, (et−k + et−k−1)/2) = 0, for k > 1.

Therefore,

γt,s = 0.5σe² and ρt,s = 1, for |t − s| = 0,
γt,s = 0.25σe² and ρt,s = 0.5, for |t − s| = 1,
γt,s = 0 and ρt,s = 0, for |t − s| > 1.

In this example, the values of γt,t−k and ρt,t−k depend only on k (but
not on t). This is related to the stationarity of the stochastic process.
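
A quick simulation check of the lag-one autocorrelation (a sketch; the
series length is an arbitrary choice):

# Simulate Y_t = (e_t + e_{t-1})/2 and compare the sample ACF with theory
set.seed(135)
e <- rnorm(10001)
Y <- (e[-1] + e[-length(e)]) / 2   # averages of consecutive e's
acf(Y, lag.max = 5, plot = FALSE)  # lag-1 value should be near 0.5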
2.3. Stationarity
Simplifying assumptions are needed to study the structure of a stochastic
process. The most important one is stationarity. The basic idea of
stationarity is that the probability laws that govern the behavior of the
process do not change over time.
Definition 7
A process {Yt} is said to be strictly stationary if the joint distribution
of Yt1, Yt2, · · · , Ytn is the same as the joint distribution of
Yt1−k, Yt2−k, · · · , Ytn−k for all choices of time points t1, t2, · · · , tn
and all choices of time lag k.

Suppose {Yt} is strictly stationary. Setting n = 1 shows that all the
Y's are identically distributed, so

E(Yt) = E(Ys), Var(Yt) = Var(Ys), for all t and s. (1)

Setting n = 2 shows that the joint distribution of Yt and Ys is the
same as that of Yt−k and Ys−k for any lag k, so

Cov(Yt, Ys) = Cov(Yt−k, Ys−k). (2)



Definition 8
A stochastic process {Yt } is said to be weakly stationary if it
satisfies (1) and (2) for all indices t and s, that is:
1 All Y ’s have the same mean.
2 γt,s = γt−k,s−k for all times t and s and lag k.

Weak stationarity assumptions are usually sufficient for building
models. Note that taking k = s in property 2 gives γt,s = γt−s,0, so
the autocovariance of a stationary process depends only on the time
difference. For a stationary process {Yt}, we therefore define

γk := Cov(Yt, Yt−k), ρk := Corr(Yt, Yt−k). (3)

Theorem 9
The following are true for a stationary process:
1. γ0 = Var(Yt), |γk| ≤ γ0, γk = γ−k.
2. ρ0 = 1, ρk = γk/γ0, |ρk| ≤ 1, ρk = ρ−k.



Example. White Noise:
Definition 10
A sequence of independent, identically distributed random variables
(i.i.d. r.v.s) {et : t = 0, ±1, ±2, ±3, · · · } is called a white noise
process.

A white noise process is strictly stationary!


We usually assume that the white noise has zero mean, i.e.
µt = E(et) = 0, and denote the variance by Var(et) = σe². Then

γk = σe² for k = 0, γk = 0 for k ≠ 0;
ρk = 1 for k = 0, ρk = 0 for k ≠ 0.

Many useful processes can be constructed from white noise.
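
For instance, the sample ACF of simulated white noise is near zero at
every nonzero lag (a sketch; the series length is arbitrary):

# White noise: sample autocorrelations at nonzero lags are near 0
set.seed(42)
e <- rnorm(500)
acf(e, main = "Sample ACF of Gaussian white noise")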



Ex. The moving average process {Yt} with Yt = (et + et−1)/2 is a
stationary process, where

ρk = 1 for k = 0,
ρk = 0.5 for |k| = 1,
ρk = 0 for |k| > 1.

Ex. The random walk process {Yt} with Yt = e1 + e2 + · · · + et is
not a stationary process. For 1 ≤ t ≤ s, we have shown that

γt,s = tσe² ≠ (t − 1)σe² = γt−1,s−1.

However, define the difference operator ∇Yt = Yt − Yt−1. Then the
difference series {∇Yt} = {et} is stationary.

Many nonstationary series can be transformed into stationary series
by differencing one or more times.
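
A short R illustration (a sketch):

# Differencing a random walk recovers the stationary white noise steps
set.seed(975)
Y <- cumsum(rnorm(200))   # a random walk: nonstationary
dY <- diff(Y)             # first difference: exactly the e's here
acf(dY, plot = FALSE)     # near 0 at all nonzero lags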



Exercises related to the sample mean and sample variance:

Ex. (EX2.17) Let {Yt} be stationary with autocovariance function γk
and autocorrelation function ρk. Let Ȳ = (1/n) Σ_{t=1}^{n} Yt. Show that

Var(Ȳ) = (1/n) Σ_{k=−n+1}^{n−1} (1 − |k|/n) γk
       = γ0/n + (2/n) Σ_{k=1}^{n−1} (1 − k/n) γk
       = (γ0/n) [1 + 2 Σ_{k=1}^{n−1} (1 − k/n) ρk].
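
We can check the first expression numerically for the moving-average
example above, where γ0 = 0.5σe² and γ1 = 0.25σe² (a sketch taking
σe = 1 and the arbitrary choice n = 25):

# Monte Carlo check of Var(Ybar) for Y_t = (e_t + e_{t-1})/2, n = 25
set.seed(8)
n <- 25
ybar <- replicate(20000, {
  e <- rnorm(n + 1)
  mean((e[-1] + e[-(n + 1)]) / 2)
})
var(ybar)                         # Monte Carlo estimate
(0.5 + 2 * (1 - 1/n) * 0.25) / n  # (gamma_0 + 2(1 - 1/n) gamma_1)/n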



Ex. (EX2.18) Let {Yt} be stationary with autocovariance function γk.
Define the sample variance as

S² = (1/(n − 1)) Σ_{t=1}^{n} (Yt − Ȳ)².

(a) First show that for any µ,

Σ_{t=1}^{n} (Yt − µ)² = Σ_{t=1}^{n} (Yt − Ȳ)² + n(Ȳ − µ)².

(b) Use part (a) and EX2.17 to show that

E(S²) = (n/(n − 1)) γ0 − (n/(n − 1)) Var(Ȳ)
      = γ0 − (2/(n − 1)) Σ_{k=1}^{n−1} (1 − k/n) γk.

(c) If {Yt} is a white noise process with variance γ0, show that
E(S²) = γ0. In other words, S² is then an unbiased estimator of the
variance of the Y's.
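
A numerical check of part (b) for the moving-average example (a sketch
with σe = 1, so γ0 = 0.5 and γ1 = 0.25, and the arbitrary choice n = 25):

# Monte Carlo check of E(S^2) for Y_t = (e_t + e_{t-1})/2, n = 25
set.seed(16)
n <- 25
s2 <- replicate(20000, {
  e <- rnorm(n + 1)
  var((e[-1] + e[-(n + 1)]) / 2)  # var() uses the 1/(n-1) divisor
})
mean(s2)                                # Monte Carlo estimate of E(S^2)
0.5 - (2 / (n - 1)) * (1 - 1/n) * 0.25  # theoretical value from (b)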