Time series statistical models
5-6 Intro
- Adjacent data points in a time series have strong correlation
- At discrete time points ($t = 0, ±1, ±2, …$) we make observations ($X_0$, $X_{±1}$, $X_{±2}$, …) of some quantity
- $X_t$ is the random variable itself
- The observed values $x_0$, $x_{±1}$, … also form a time series (a realization)
- The $t=0$ point is chosen by us; the time origin is arbitrary
7 White noise series
- $W_0, W_{±1}, W_{±2}, …$
- $W_t \sim WN(0, \sigma_W^2)$
- WN → N: Gaussian white noise
- All $W_t$ are uncorrelated; for Gaussian white noise they are also independent
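A quick numerical check of these properties (a stdlib-only sketch; the sample size, seed, and $\sigma_W = 1$ are arbitrary choices for the demo):

```python
import random
import statistics

random.seed(0)
sigma_w = 1.0   # sigma_W, assumed 1 for the demo
n = 10_000

# Draw n Gaussian white-noise terms W_t ~ N(0, sigma_W^2)
w = [random.gauss(0.0, sigma_w) for _ in range(n)]

mean_w = statistics.fmean(w)       # should be near 0
var_w = statistics.pvariance(w)    # should be near sigma_W^2 = 1

# Sample lag-1 autocovariance: near 0 because distinct W_t are uncorrelated
lag1 = sum((w[t] - mean_w) * (w[t + 1] - mean_w) for t in range(n - 1)) / n

print(round(mean_w, 2), round(var_w, 2), round(lag1, 2))
```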
Measures of dependence
Mean function
- The mean is evaluated at a given TIME POINT
- At that time point $t$, the mean is computed over the distribution of $X_t$ as an integral: $\mu_X(t) = E(X_t)$
- e.g. the average number of patients on a Tuesday
- Random walk model
- Expression:
- $X_0 = 0$ for $t=0$
- $X_t = X_{t-1} + W_t$ for $t = 1, 2, …$
- $W_t \sim N(0, \sigma_W^2)$
- $\mu_X(t) = 0$
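The zero mean can be illustrated by simulation (a minimal sketch; walk length and replication count are arbitrary):

```python
import random

random.seed(1)
sigma_w = 1.0
T = 100       # walk length (arbitrary)
reps = 2_000  # number of independent walks

# X_0 = 0; X_t = X_{t-1} + W_t with W_t ~ N(0, sigma_W^2)
def random_walk(T):
    x = 0.0
    for _ in range(T):
        x += random.gauss(0.0, sigma_w)
    return x

# Averaging X_T over many walks illustrates mu_X(t) = 0 at every t
mean_final = sum(random_walk(T) for _ in range(reps)) / reps
print(round(mean_final, 2))
```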
Autocorrelation function
- $s, t$: two time points
- Autocovariance
- $\gamma_X(s,t) = Cov(X_s, X_t) = E[(X_s-\mu_X(s))(X_t-\mu_X(t))]$
- When $s=t$, $\gamma_X(t,t) = Var(X_t)$
- 12 Autocorrelation
- $\Large\rho_X(s,t) = \frac{\gamma_X(s,t)}{\sqrt{\gamma_X(s,s)\gamma_X(t,t)}}$
- When $s=t$, $\rho_X(t,t) = 1$
- If $\rho_X(s,t)$ is always close to $±1$ for some points $s < t$, then $X_t$ can be predicted from $X_s$ by the linear model $X_t = \beta_0 + \beta_1 X_s + W_t$, $W_t \sim N(0, \sigma_W^2)$
- 13 Cross-correlation
- $\Large\rho_{X,Y}(s,t) = \frac{\gamma_{X,Y}(s,t)}{\sqrt{\gamma_X(s,s)\gamma_Y(t,t)}}$
- If $\rho_{X,Y}(s,t)$ is always close to $±1$ for some points $s < t$, then $Y_t$ can be predicted from $X_s$ by the linear model $Y_t = \beta_0 + \beta_1 X_s + W_t$, $W_t \sim N(0, \sigma_W^2)$
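A sketch of scanning lags for the strongest cross-correlation. The data-generating relation (Y depends on X two steps back, noise level 0.3) is an assumption made up for this demo:

```python
import random

random.seed(6)
n = 3_000
x = [random.gauss(0.0, 1.0) for _ in range(n)]
# Assumed toy relation: Y_t = X_{t-2} + noise, so |rho_{X,Y}| peaks at lag 2
y = [(x[t - 2] if t >= 2 else 0.0) + random.gauss(0.0, 0.3) for t in range(n)]

def corr_at_lag(h):
    """Sample correlation between X_{t-h} and Y_t."""
    xs, ys = x[: n - h], y[h:]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / len(xs)
    sx = (sum((a - mx) ** 2 for a in xs) / len(xs)) ** 0.5
    sy = (sum((b - my) ** 2 for b in ys) / len(ys)) ** 0.5
    return cov / (sx * sy)

# The lag with the largest |correlation| recovers the planted lag of 2
best = max(range(6), key=lambda h: abs(corr_at_lag(h)))
print(best, round(corr_at_lag(best), 2))
```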
Stationary time series
Criteria
- $\mu_X(t)$ is constant
- $\gamma_X(s,t)$ is solely determined by $h = |s-t|$
- When $s=t$, the result is the variance, which is then constant and does not depend on $t$
- The key issue is to decouple the dependence on the absolute time $t$
Characteristics
- Autocovariance and autocorrelation at lag $h$, derived from the 11-13 part:
- $\gamma_X(h) = Cov(X_{t+h}, X_t)$
- $Var(X_t) = \gamma_X(0)$
- $\Large\rho_X(h) = \frac{\gamma_X(h)}{\gamma_X(0)}$
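These lag-$h$ quantities can be estimated from data with the standard sample autocovariance; a stdlib-only sketch using white noise as the stationary series:

```python
import random

random.seed(2)
n = 5_000
# A stationary series for the demo: Gaussian white noise
x = [random.gauss(0.0, 1.0) for _ in range(n)]

def sample_gamma(x, h):
    """Sample autocovariance: (1/n) * sum over t of (x_{t+h} - xbar)(x_t - xbar)."""
    n = len(x)
    xbar = sum(x) / n
    return sum((x[t + h] - xbar) * (x[t] - xbar) for t in range(n - h)) / n

gamma0 = sample_gamma(x, 0)          # estimate of Var(X_t) = gamma_X(0)
rho1 = sample_gamma(x, 1) / gamma0   # rho_X(h) = gamma_X(h) / gamma_X(0)
print(round(gamma0, 2), round(rho1, 2))
```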
- Gaussian white noise is stationary because:
- $\mu=0$, by definition
- $\gamma(s,t)$:
- s=t:
- $\gamma(t,t) = Var(W_t) = \sigma_W^2$
- s≠t:
- $\gamma(s,t) = Cov(W_s, W_t) = 0$ because they are uncorrelated
- Random walk model is not stationary because:
- $\mu=0$, so the mean criterion is satisfied
- $\gamma(t,t) = Cov(X_t, X_t) = Var(X_t) = Var(W_1+…+W_t) = Var(W_1) + … + Var(W_t) + Cov(W_1,W_2) + … + Cov(W_{t-1},W_t) = t\sigma_W^2 + 0$
- $\gamma(t,t)$ depends on time $t$, so the series is not stationary
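The growth $Var(X_t) = t\sigma_W^2$ can be checked by Monte Carlo (a sketch; the chosen time points 10 and 40 and the replication count are arbitrary):

```python
import random

random.seed(3)
sigma_w = 1.0
reps = 4_000

def est_var_at(t):
    """Monte-Carlo estimate of Var(X_t) for X_t = W_1 + ... + W_t."""
    xs = [sum(random.gauss(0.0, sigma_w) for _ in range(t)) for _ in range(reps)]
    m = sum(xs) / reps
    return sum((x - m) ** 2 for x in xs) / reps

# Variance grows linearly in t: near 10 and 40 for t = 10 and t = 40
v10, v40 = est_var_at(10), est_var_at(40)
print(round(v10, 1), round(v40, 1))
```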
- Homoskedasticity: variance is constant over time
- Heteroskedasticity: variance changes over time
- 23 Statistical hypothesis test
- 24 Jointly stationary time series
- 25 example:
- $|\rho|$ is largest at $h=-6$, so $Y_t = \beta_0 + \beta_1 X_{t-6} + W_t$
- We can predict $Y_t$ based on $X_{t-6}$, i.e. on $X$ from 6 months ago
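A sketch of fitting that lag-6 model by ordinary least squares on synthetic data. The true coefficients ($\beta_0 = 2$, $\beta_1 = 3$) and noise level are assumptions invented for the demo:

```python
import random

random.seed(4)
n = 2_000
x = [random.gauss(0.0, 1.0) for _ in range(n)]

# Assumed data-generating model: Y_t = beta0 + beta1 * X_{t-6} + W_t
beta0, beta1 = 2.0, 3.0
y = [beta0 + beta1 * x[t - 6] + random.gauss(0.0, 0.5) for t in range(6, n)]
xs = x[: n - 6]   # lagged predictor X_{t-6}, aligned with y

# Ordinary least squares for the simple linear model
xbar = sum(xs) / len(xs)
ybar = sum(y) / len(y)
b1 = (sum((a - xbar) * (b - ybar) for a, b in zip(xs, y))
      / sum((a - xbar) ** 2 for a in xs))
b0 = ybar - b1 * xbar
print(round(b0, 1), round(b1, 1))   # estimates recover beta0 and beta1
```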
Assignment 1
- $\mu_X(t)$
- $\mu_X(t) = E(X_t) = E(W_0 + W_1 + … + W_t) = 0$
- Satisfies this criterion
- $\gamma(s,t)$: Let $s=t$. Since the $W_t$ are uncorrelated: $\gamma(t,t)$ $= Cov(X_t, X_t)$ $= Var(X_t)$ $= Var(W_0+…+W_t)$ $= Var(W_0) + … + Var(W_t) + Cov(W_0,W_1) + Cov(W_0,W_2) + … + Cov(W_{t-1},W_t)$
$= (t+1)\sigma_W^2 + 0$
$\gamma(s,t)$ is dependent on $t$ when $s=t$
In conclusion, this time series model is NOT stationary.
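The $(t+1)\sigma_W^2$ result (note the $t+1$ terms, since the sum starts at $W_0$) can be verified numerically; a sketch with $t=9$ and $\sigma_W=1$ chosen arbitrarily:

```python
import random

random.seed(5)
sigma_w = 1.0
t, reps = 9, 5_000

# X_t = W_0 + W_1 + ... + W_t has t+1 terms, so Var(X_t) = (t+1) * sigma_W^2
xs = [sum(random.gauss(0.0, sigma_w) for _ in range(t + 1)) for _ in range(reps)]
m = sum(xs) / reps
var_xt = sum((x - m) ** 2 for x in xs) / reps
print(round(var_xt, 1))   # should be near (t+1) * sigma_W^2 = 10
```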