Lecture 3: Autoregressive Moving Average (ARMA) Models and Their Practical Applications
Winter/Spring 2020
Overview
Moving average processes
Autoregressive processes: moments and the Yule-Walker equations
Wold’s decomposition theorem
Moments, ACFs and PACFs of AR and MA processes
Mixed ARMA(p, q) processes
Model selection: SACF and SPACF vs. information criteria
Model specification tests
Forecasting with ARMA models
A few examples of applications
Lecture 3: Autoregressive Moving Average (ARMA) Models – Prof. Guidolin 2
Moving Average Process
$y_t = \mu + \epsilon_t + \psi_1 \epsilon_{t-1} + \psi_2 \epsilon_{t-2} + \cdots + \psi_q \epsilon_{t-q}$
o The sufficient condition for the mean of an AR(p) process to exist and be finite is that the sum of the AR coefficients is less than one in absolute value, $|\phi_1 + \phi_2 + \cdots + \phi_p| < 1$; see below
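As a quick numeric check of this condition, the unconditional mean it guarantees, $E[y_t] = \phi_0/(1 - \phi_1 - \cdots - \phi_p)$, can be evaluated directly. The AR(2) coefficients below are illustrative, not taken from the lectures:

```python
# Unconditional mean of a stationary AR(p): E[y] = phi0 / (1 - phi1 - ... - phip),
# well defined when the sum of AR coefficients is below one in absolute value.
phi0 = 0.5
phi = [0.5, 0.2]                 # illustrative AR(2): phi1 + phi2 = 0.7 < 1
assert abs(sum(phi)) < 1         # the sufficient condition quoted above
mean_y = phi0 / (1 - sum(phi))
print(mean_y)                    # 0.5 / 0.3, roughly 1.667
```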
Moments and ACFs of an AR(p) Process
The (unconditional) variance of an AR(p) process is computed from Yule-Walker equations written in recursive form (see below)
o In the AR(2) case, for instance, we have
$Var(y_t) = \dfrac{(1-\phi_2)\,\sigma_\epsilon^2}{(1+\phi_2)(1-\phi_1-\phi_2)(1+\phi_1-\phi_2)}$
o For AR(p) models, the characteristic polynomials are rather
convoluted – it is infeasible to define simple restrictions on the AR
coefficients that ensure covariance stationarity
o E.g., for AR(2), the conditions are $\phi_1 + \phi_2 < 1$, $\phi_2 - \phi_1 < 1$, $|\phi_2| < 1$
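The AR(2) restrictions can be verified numerically against the equivalent condition that the roots of the characteristic polynomial lie outside the unit circle; the variance formula above can be evaluated at the same time. Coefficient values here are illustrative:

```python
import numpy as np

# Covariance-stationarity of an AR(2), checked two equivalent ways
# (illustrative parameter values, not from the lectures).
phi1, phi2, sigma2 = 0.5, 0.2, 1.0

# (i) the three coefficient restrictions quoted above
cond = (phi1 + phi2 < 1) and (phi2 - phi1 < 1) and (abs(phi2) < 1)

# (ii) roots of 1 - phi1*z - phi2*z^2 = 0 must lie outside the unit circle
roots = np.roots([-phi2, -phi1, 1.0])
cond_roots = bool(np.all(np.abs(roots) > 1))

# closed-form unconditional variance of the AR(2)
var_y = (1 - phi2) * sigma2 / ((1 + phi2) * (1 - phi1 - phi2) * (1 + phi1 - phi2))
print(cond, cond_roots, var_y)
```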
The autocovariances and autocorrelations functions of AR(p)
processes can be computed by solving a set of simultaneous
equations known as Yule-Walker equations
o It is a system of K equations that we recursively solve to determine the
ACF of the process, i.e., $\rho_h$ for h = 1, 2, …
o See example concerning AR(2) process given in the lectures and/or in
the textbook
For a stationary AR(p), the ACF will decay geometrically to zero
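The recursive structure of the Yule-Walker equations can be sketched for the AR(2) example mentioned above. The coefficients below are illustrative; the point is the geometric decay of $\rho_h$:

```python
# Yule-Walker recursion for the ACF of an AR(p):
#   rho_h = phi1*rho_{h-1} + ... + phip*rho_{h-p},  with rho_0 = 1, rho_{-h} = rho_h.
# A minimal AR(2) sketch with illustrative coefficients.
phi1, phi2 = 0.5, 0.2

rho = [1.0]
rho.append(phi1 / (1 - phi2))                # rho_1 from the first YW equation
rho.append(phi1 * rho[1] + phi2 * rho[0])    # rho_2, then recursively onward
for h in range(3, 11):
    rho.append(phi1 * rho[h - 1] + phi2 * rho[h - 2])

print([round(r, 4) for r in rho[:4]])        # decays geometrically toward zero
```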
ACF and PACF of AR(p) Process
The SACF and SPACF are of primary importance to identify the lag
order p of a process
ACF and PACF of AR(p) and MA(q) Processes
o As one would expect of an ARMA process, both the ACF and the PACF
decline geometrically: the ACF as a result of the AR part and the PACF
as a result of the MA part
o However, because the coefficient of the MA part is quite small, the PACF becomes insignificant after only two lags; the AR coefficient is higher (0.7), so the ACF dies away rather slowly, only after about 9 lags
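This pattern can be reproduced by simulation. The sketch below assumes an ARMA(1,1) with AR coefficient 0.7 and a small MA coefficient (0.2 is an assumed value for illustration), and computes the sample ACF and PACF with plain numpy:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an ARMA(1,1): large AR coefficient (0.7), small MA coefficient (0.2).
phi, theta, T = 0.7, 0.2, 5000
eps = rng.standard_normal(T)
y = np.zeros(T)
y[0] = eps[0]
for t in range(1, T):
    y[t] = phi * y[t - 1] + eps[t] + theta * eps[t - 1]

def sample_acf(x, h):
    """Sample autocorrelation at lag h."""
    z = x - x.mean()
    return (z[:-h] @ z[h:]) / (z @ z)

def sample_pacf(x, k):
    """Sample PACF at lag k: the last OLS coefficient of an AR(k) regression."""
    Y = x[k:]
    X = np.column_stack([np.ones(len(Y))]
                        + [x[k - j:len(x) - j] for j in range(1, k + 1)])
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return beta[-1]

acf = [sample_acf(y, h) for h in range(1, 11)]    # decays geometrically, rate ~0.7
pacf = [sample_pacf(y, k) for k in range(1, 11)]  # dies out after a couple of lags
print([round(a, 2) for a in acf[:4]], [round(p, 2) for p in pacf[:4]])
```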
Model Selection: SACF and SPACF
A first strategy compares the sample ACF and PACF with the theoretical, population ACF and PACF and uses them to identify the order of the ARMA(p, q) model
[Figure: SACF and SPACF, US CPI inflation]
The SBIC is the IC that imposes the strongest penalty (ln T) for each additional parameter that is included in the model.
The HQIC embodies a penalty that is somewhere in between the
one typical of AIC and the SBIC
o SBIC is a consistent criterion, i.e., it selects the true model asymptotically
o AIC asymptotically overestimates
the order/complexity of a model
with positive probability
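The relative size of the three penalties can be sketched on simulated data. The snippet below is illustrative (simulated AR(2), criteria in the common "log variance plus penalty" form), not the exact computation used in the lectures:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate an AR(2) and compare AIC, HQIC and SBIC on AR(p) fits, p = 1..6.
T = 2000
y = np.zeros(T)
eps = rng.standard_normal(T)
for t in range(2, T):
    y[t] = 0.5 * y[t - 1] + 0.2 * y[t - 2] + eps[t]

def info_criteria(y, p):
    """Conditional OLS fit of an AR(p) with intercept; returns (AIC, HQIC, SBIC)."""
    Y = y[p:]
    X = np.column_stack([np.ones(len(Y))]
                        + [y[p - j:len(y) - j] for j in range(1, p + 1)])
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    resid = Y - X @ beta
    n, k = len(Y), p + 1
    s2 = resid @ resid / n
    return (np.log(s2) + 2 * k / n,                      # AIC: weakest penalty
            np.log(s2) + 2 * k * np.log(np.log(n)) / n,  # HQIC: intermediate
            np.log(s2) + k * np.log(n) / n)              # SBIC: strongest (ln T)

# the lag order each criterion prefers
best = {name: min(range(1, 7), key=lambda p: info_criteria(y, p)[i])
        for i, name in enumerate(["AIC", "HQIC", "SBIC"])}
print(best)
```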
o It is not uncommon that different criteria lead to different models
o Using the guidance derived from the inspection of the correlogram, we believe that an ARMA model is more likely, given that the ACF does not show signs of geometric decay
o One could be inclined to conclude in favor of an ARMA(2,1) for the US monthly CPI inflation rate
Estimation Methods: OLS vs MLE
The estimation of an AR(p) model is straightforward because it can be performed simply by (conditional) OLS
o Conditional on p starting values for the series
When an MA(q) component is included, the estimation becomes
more complicated and requires Maximum Likelihood
o Please review Statistics prep-course + see the textbook
However, this opposition is only apparent: conditional on the p
starting values, under the assumptions of a classical regression
model, OLS and MLE are identical for an AR(p)
o See 20191 for the classical linear regression model
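This equivalence can be checked numerically: conditional on the first observation, the Gaussian log-likelihood of an AR(1) is maximized in the AR coefficient exactly at the OLS estimate. Data and parameter values below are simulated for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate an AR(1) with phi = 0.6 (illustrative).
T = 1000
y = np.zeros(T)
eps = rng.standard_normal(T)
for t in range(1, T):
    y[t] = 0.6 * y[t - 1] + eps[t]

x, Y = y[:-1], y[1:]
phi_ols = (x @ Y) / (x @ x)          # conditional OLS estimate

def cond_loglik(phi):
    """Gaussian log-likelihood, conditional on y_1, with sigma^2 concentrated out."""
    resid = Y - phi * x
    s2 = resid @ resid / len(Y)
    return -0.5 * len(Y) * (np.log(2 * np.pi * s2) + 1)

# grid-search the conditional MLE around the OLS estimate
grid = np.linspace(phi_ols - 0.05, phi_ols + 0.05, 201)
phi_mle = grid[np.argmax([cond_loglik(p) for p in grid])]
print(phi_ols, phi_mle)              # the two estimates coincide
```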
Estimation Methods: MLE
The first step in deriving the MLE consists of defining the joint
probability distribution of the observed data
The joint density of the random variables in the sample may be written as a product of conditional densities, so that the log-likelihood function of an ARMA(p, q) process has the form
$\ln L(\boldsymbol{\theta}) = \sum_{t=1}^{T} \ln f(y_t \mid \mathcal{F}_{t-1}; \boldsymbol{\theta})$
o For instance, if $y_t$ has a joint and marginal normal pdf (which must derive from the fact that $\epsilon_t$ has it), then
$\ln L = -\frac{T}{2}\ln(2\pi) - \frac{T}{2}\ln \sigma_\epsilon^2 - \frac{1}{2\sigma_\epsilon^2}\sum_{t=1}^{T} \epsilon_t^2$
where $\epsilon_t = y_t - E[y_t \mid \mathcal{F}_{t-1}]$
o The h-step forecast error is $e_t(h) = y_{t+h} - \hat{y}_t(h)$
o The h-step forecast can be computed recursively, see the textbook/class notes
For a stationary AR(p) model, $\hat{y}_t(h)$ converges to the mean $E[y_t]$ as h grows: the mean reversion property
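The mean reversion property can be seen directly from the AR(1) forecast recursion $\hat{y}_t(h) = \phi_0 + \phi_1 \hat{y}_t(h-1)$; coefficients and the last observed value below are hypothetical illustration values:

```python
# Recursive point forecasts from an AR(1): y_hat(h) = phi0 + phi1 * y_hat(h-1),
# converging to the unconditional mean phi0 / (1 - phi1) as h grows.
phi0, phi1 = 0.5, 0.8
mean_y = phi0 / (1 - phi1)     # unconditional mean = 2.5
y_hat = 10.0                   # hypothetical last observation, far above the mean

forecasts = []
for h in range(1, 31):
    y_hat = phi0 + phi1 * y_hat
    forecasts.append(y_hat)
print(forecasts[0], forecasts[-1], mean_y)   # forecasts decay toward 2.5
```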
Forecasting with MA(q)
Because the model has a memory limited to q periods only, the
point forecasts converge to the mean quickly and they are forced to
do so when the forecast horizon exceeds q periods
o E.g., for an MA(2), $\hat{y}_t(1) = \mu + \theta_1 \epsilon_t + \theta_2 \epsilon_{t-1}$, because both shocks have been observed and are therefore known
o Because $\epsilon_{t+1}$ has not yet been observed at time t, and its expectation at time t is zero, then $\hat{y}_t(2) = \mu + \theta_2 \epsilon_t$
o By the same principle, $\hat{y}_t(3) = \mu$, because $\epsilon_{t+1}$, $\epsilon_{t+2}$, and $\epsilon_{t+3}$ are not known at time t
By induction, the forecasts of an ARMA(p, q) model can be obtained from
$\hat{y}_t(h) = \phi_0 + \sum_{i=1}^{p} \phi_i\, \hat{y}_t(h-i) + \sum_{j=1}^{q} \theta_j E_t[\epsilon_{t+h-j}]$
where $\hat{y}_t(h-i) = y_{t+h-i}$ for $h-i \le 0$, and $E_t[\epsilon_{t+h-j}] = \epsilon_{t+h-j}$ for $h \le j$ and zero otherwise
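This recursion can be sketched as a short helper. The function below is a hypothetical illustration (not from the lectures): it treats observed values and residuals as known and replaces future shocks by their zero conditional expectation:

```python
def arma_forecast(y, eps, phi0, phi, theta, H):
    """h-step forecasts, h = 1..H, from the ARMA(p, q) recursion above.

    y, eps : observed series and residuals up to time t (equal length, assumed
             long enough to cover p and q lags); phi, theta : AR and MA coeffs.
    """
    y_ext = list(y)            # extended with forecasts as the recursion proceeds
    T = len(y)
    out = []
    for h in range(1, H + 1):
        # AR part: past observations or previously computed forecasts
        ar = sum(c * y_ext[T + h - 1 - i] for i, c in enumerate(phi, 1))
        # MA part: observed shocks where t+h-j <= t, zero expectation otherwise
        ma = sum(c * (eps[T + h - 1 - j] if h <= j else 0.0)
                 for j, c in enumerate(theta, 1))
        y_ext.append(phi0 + ar + ma)
        out.append(y_ext[-1])
    return out

# MA(2) example matching the slide above: mu = 1, theta = (0.4, 0.3),
# last two (hypothetical) shocks eps_t = 0.5, eps_{t-1} = -0.2
f = arma_forecast([0.0] * 10, [0.0] * 8 + [-0.2, 0.5], 1.0, [], [0.4, 0.3], 4)
print(f)   # forecasts revert to mu = 1 once h exceeds q = 2
```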
How do we assess the forecasting accuracy of a model?