Autoregressive Moving Average (ARMA) Models
Introduction
- Definition
- An autoregressive model of order $p$, denoted by AR(p), is described by
- AR(p): $X_t = \phi_0 + \phi_1 X_{t-1} + \dots + \phi_p X_{t-p} + W_t$
- p=1 -> AR(1): $X_t = \phi_0 + \phi_1 X_{t-1} + W_t$
- Where:
- $X_t$ is the value of a stationary time series at time $t$
- $\phi_0, \phi_1, \dots, \phi_p$ are coefficients, $\phi_p \neq 0$
- $W_t$ is Gaussian white noise
AR models
- Applies only to stationary series
- “Auto” because the series is regressed on its own past values
- AR(1)
- Case $|\phi_1| < 1$:
- Consider AR(1): $X_t = \phi_0 + \phi_1 X_{t-1} + W_t$ for $|\phi_1| < 1$
- It can be shown that:
- $X_t = \frac{\phi_0}{1-\phi_1} + W_t + \phi_1 W_{t-1} + \phi_1^2 W_{t-2} + \dots$
- $E[X_t] = \frac{\phi_0}{1-\phi_1}$
- Autocorrelation at lag $h$:
- $\rho_X(h) = \phi_1^h$ for $h = 0, 1, 2, \dots$
- $X_t$ is correlated with all lagged values of the series (a simulation check appears at the end of this AR(1) discussion)
- Case $|\phi_1| ≥ 1$:
- Consider AR(1): $X_t = \phi_0 + \phi_1 X_{t-1} + W_t$ for $|\phi_1| \geq 1$
- This process cannot be represented in terms of current and previous values of $W_t$
- It is called an explosive process
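The AR(1) facts above are easy to sanity-check numerically. Below is a minimal sketch, assuming numpy is available; the coefficients $\phi_0 = 1$ and $\phi_1 = 0.6$ are arbitrary example values. It simulates the recursion directly and compares the sample mean and autocorrelations against $\frac{\phi_0}{1-\phi_1}$ and $\phi_1^h$.

```python
import numpy as np

rng = np.random.default_rng(0)
phi0, phi1 = 1.0, 0.6        # hypothetical example coefficients, |phi1| < 1
n = 100_000

# Simulate the AR(1) recursion X_t = phi0 + phi1 * X_{t-1} + W_t
w = rng.standard_normal(n)
x = np.empty(n)
x[0] = phi0 / (1 - phi1)     # start at the stationary mean
for t in range(1, n):
    x[t] = phi0 + phi1 * x[t - 1] + w[t]

print("sample mean:", x.mean(), "vs theory:", phi0 / (1 - phi1))

# Sample autocorrelation at lags 1..5 vs the theoretical phi1**h
xc = x - x.mean()
for h in range(1, 6):
    rho_hat = (xc[:-h] @ xc[h:]) / (xc @ xc)
    print(f"lag {h}: sample {rho_hat:.3f}  theory {phi1 ** h:.3f}")
```

With this many draws the sample values should match the theory to roughly two decimal places.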
- Causal:
- Def:
- AR(p): $X_t = \phi_0 + \phi_1 X_{t-1} + \dots + \phi_p X_{t-p} + W_t$ is causal if $X_t$ can be expressed as a linear combination of current and previous values of $W_t$ as:
- $X_t = \mu_X + W_t + c_1 W_{t-1} + c_2 W_{t-2} + \dots$
- $\mu_X$ is the mean function of stationary time series
- An AR(p) is causal if and only if all roots of the polynomial equation
- $1 - \phi_1 z - \dots - \phi_p z^p = 0$
- have absolute values greater than 1 (a root-checking sketch follows this list)
- The autocorrelation function of a causal AR(p) model either decays exponentially or oscillates like a damped sinusoid as the lag increases
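The root condition can be checked mechanically. A minimal sketch, again assuming numpy, with hypothetical AR(2) coefficients $\phi_1 = 0.5$, $\phi_2 = 0.3$:

```python
import numpy as np

phi = [0.5, 0.3]             # hypothetical AR(2) coefficients

# Causality: every root of 1 - phi1*z - ... - phip*z^p = 0
# must lie strictly outside the unit circle.
coeffs = np.concatenate(([1.0], -np.asarray(phi)))   # ascending powers of z
roots = np.polynomial.polynomial.polyroots(coeffs)
print("roots:", roots)                               # approx. 1.17 and -2.84
print("causal:", bool(np.all(np.abs(roots) > 1)))    # True for this example
```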
Moving Average models
- Def
- A moving average model of order $q$, denoted by MA(q), is described by
- MA(q): $X_t = \theta_0 + \theta_1 W_{t-1} + \dots + \theta_q W_{t-q} + W_t$
- $\theta_1, \dots, \theta_q$ are coefficients, $\theta_q \neq 0$
- $W_t$ is Gaussian white noise
- Applies only to stationary series
- “Moving average” because $X_t$ can be represented as a weighted sum of recent values of the white noise series
- MA(1)
- Consider MA(1): $X_t = \theta_0 + \theta_1 W_{t-1} + W_t$
- $E[X_t] = \theta_0$
- $\rho_X(h) = $
- $\frac{\theta_1}{1+\theta_1^2}$ for $h=1$; this can be proved from the definition of autocovariance
- $0$ for $h = 2, 3, \dots$
- $h=0$ is not considered because the value is always 1
- For MA(q), $\rho_X(h) = 0$ from lag $q+1$ onwards
- On an ACF plot, if $\rho$ suddenly drops to 0 after some lag, an MA model may be a good fit, and its order $q$ is that last significant lag (a simulation check follows this list)
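As a sanity check of the MA(1) autocorrelations, here is a minimal sketch (numpy assumed; $\theta_0 = 0$ and $\theta_1 = 0.8$ are arbitrary example values), for which $\frac{\theta_1}{1+\theta_1^2} \approx 0.488$:

```python
import numpy as np

rng = np.random.default_rng(1)
theta0, theta1 = 0.0, 0.8    # hypothetical example coefficients
n = 100_000

# Simulate MA(1): X_t = theta0 + theta1 * W_{t-1} + W_t
w = rng.standard_normal(n + 1)
x = theta0 + theta1 * w[:-1] + w[1:]

# Sample autocorrelations: lag 1 should land near 0.488, lags >= 2 near 0
xc = x - x.mean()
for h in range(1, 4):
    rho_hat = (xc[:-h] @ xc[h:]) / (xc @ xc)
    print(f"lag {h}: sample {rho_hat:.3f}")
```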
- Order estimation
- The ACF can reveal the order of an MA model for the reason above, but it cannot reveal the order of an AR model
- Then how can the order of an AR model be determined?
- Use the PACF (partial autocorrelation function)
- On an AR(p) it behaves just like the ACF on an MA model: it drops to 0 from lag $p+1$ onwards
- On an MA model it behaves just like the ACF on an AR model: it tails off (see the sketch after this list)
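The cutoff/tail-off pattern can be seen directly on simulated series. A minimal sketch, assuming statsmodels is installed (its ArmaProcess helper expects the lag polynomials, i.e. a leading 1 and negated AR coefficients); the AR(2) and MA(2) coefficients are arbitrary example values:

```python
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess
from statsmodels.tsa.stattools import acf, pacf

# Hypothetical example processes: AR(2) with phi = (0.5, 0.3),
# MA(2) with theta = (0.6, 0.4).
ar2 = ArmaProcess(ar=[1, -0.5, -0.3], ma=[1]).generate_sample(5000, burnin=500)
ma2 = ArmaProcess(ar=[1], ma=[1, 0.6, 0.4]).generate_sample(5000, burnin=500)

print("AR(2) ACF :", np.round(acf(ar2, nlags=5)[1:], 2))   # tails off
print("AR(2) PACF:", np.round(pacf(ar2, nlags=5)[1:], 2))  # cuts off after lag 2
print("MA(2) ACF :", np.round(acf(ma2, nlags=5)[1:], 2))   # cuts off after lag 2
print("MA(2) PACF:", np.round(pacf(ma2, nlags=5)[1:], 2))  # tails off
```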
ARMA models
- Def
- An ARMA model of orders p and q, denoted by ARMA(p,q), is described by
- ARMA(p,q): $X_t = \phi_0 + \phi_1 X_{t-1} + \dots + \phi_p X_{t-p} + \theta_1 W_{t-1} + \dots + \theta_q W_{t-q} + W_t$
- p = 0 -> MA(q)
- q = 0 -> AR(p)
- Neither the ACF nor the PACF has a sharp cutoff point for an ARMA model (see the fitting sketch below)
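A minimal sketch of fitting an ARMA model, assuming statsmodels (its ARIMA class with differencing order 0 is an ARMA fit); the ARMA(1,1) coefficients 0.6 and 0.4 are arbitrary example values:

```python
from statsmodels.tsa.arima_process import ArmaProcess
from statsmodels.tsa.arima.model import ARIMA

# Simulate ARMA(1,1): X_t = 0.6 X_{t-1} + 0.4 W_{t-1} + W_t
x = ArmaProcess(ar=[1, -0.6], ma=[1, 0.4]).generate_sample(5000, burnin=500)

# ARMA(1,1) is ARIMA with differencing order d = 0
res = ARIMA(x, order=(1, 0, 1)).fit()
print(res.params)   # AR and MA estimates should land near 0.6 and 0.4
```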
AR ←→ MA
- A causal AR(p) model can be represented as an MA($\infty$) model
- So the ACF of an AR model tails off, because the model has infinite MA order
- Under mild conditions, an MA(q) model can be represented as an AR($\infty$) model
- So the PACF of an MA model tails off, because the model has infinite AR order
- ARMA models contain both components and exhibit a more complex pattern that blends AR and MA behavior.
For a causal AR(1) model $X_t = \phi_0 + \phi_1 X_{t-1} + W_t$ with $|\phi_1| < 1$, it can be shown that:
$X_t = \frac{\phi_0}{1-\phi_1} + W_t + \phi_1 W_{t-1} + \phi_1^2 W_{t-2} + \dots$
Since
MA($\infty$): $X_t = \theta_0 + W_t + \theta_1 W_{t-1} + \theta_2 W_{t-2} + \dots$
Let
$\theta_0 = \frac{\phi_0}{1-\phi_1},\quad \theta_1 = \phi_1,\quad \theta_2 = \phi_1^2,\ \dots$
Then $X_t$ matches the MA($\infty$) form: $X_t$ is represented by an MA model of infinite order. A numerical check follows.
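A numerical version of this argument, as a minimal sketch assuming numpy: build the same series once by the AR(1) recursion and once from the truncated MA($\infty$) weights $\phi_1^j$. With $\phi_1 = 0.6$ (an arbitrary example value), terms beyond lag 60 are negligible, so past the truncation window the two constructions should agree to round-off level.

```python
import numpy as np

rng = np.random.default_rng(4)
phi0, phi1 = 1.0, 0.6        # hypothetical example coefficients
n, trunc = 2000, 60          # phi1**60 is ~5e-14, effectively zero

w = rng.standard_normal(n)

# Construction 1: the AR(1) recursion
x = np.empty(n)
x[0] = phi0 / (1 - phi1)
for t in range(1, n):
    x[t] = phi0 + phi1 * x[t - 1] + w[t]

# Construction 2: the truncated MA(inf) form
# X_t = phi0/(1-phi1) + sum_{j=0}^{trunc} phi1**j * W_{t-j}
weights = phi1 ** np.arange(trunc + 1)
x_ma = phi0 / (1 - phi1) + np.convolve(w, weights)[:n]

# Past the truncation window the two series coincide (up to ~1e-13)
print(np.max(np.abs(x[trunc:] - x_ma[trunc:])))
```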