Some Basic Time Series Models

Stat 565
Jan 21 2014
Charlotte Wickham
stat565.cwick.co.nz
Monday, January 20, 14
Weak Stationarity
A time series {xt} is weakly stationary if its mean function doesn't depend on time, and its autocovariance function only depends on the distance between the two time points:
μt = E[xt] = μ
ϒ(s, t) = Cov(xs, xt) = ϒ(t - s)
Often rewritten as
ϒ(h) = Cov(xt, xt+h)
xt is assumed to have finite variance.
Autocorrelation
For a stationary process the
autocorrelation is:
Cor(xt, xt+h) = ρ(h) = ϒ(h) / ϒ(0)
Some basic models
White noise
Random walk with drift
Moving average of order 1 MA(1)
Autoregressive of order 1 AR(1)
What is the mean function?
What is the autocovariance function?
Is the process weakly stationary?
White noise
{ wt } is a white noise process if the wt are uncorrelated identically distributed random variables with
E[wt] = 0 and Var[wt] = σ², for all t.
If the wt are Normally (Gaussian)
distributed, the series is known as
Gaussian white noise.
White noise
Simulated
σ² = 1
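As a quick illustration (not from the slides — a numpy sketch with arbitrary seed and sample size), simulate Gaussian white noise and check the sample moments against E[wt] = 0 and Var[wt] = σ² = 1:

```python
import numpy as np

# Simulate Gaussian white noise with sigma^2 = 1 (seed is arbitrary).
rng = np.random.default_rng(565)
n = 5000
w = rng.normal(loc=0.0, scale=1.0, size=n)

# Sample moments should be close to the theoretical ones.
mean_hat = w.mean()                           # ~ E[w_t] = 0
var_hat = w.var()                             # ~ sigma^2 = 1
lag1_corr = np.corrcoef(w[:-1], w[1:])[0, 1]  # ~ 0: the w_t are uncorrelated
```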
White noise
What is the mean function?
μt = E[wt] = 0
What is the autocovariance function?
ϒ(h) = { σ², h = 0
       { 0, otherwise
Is white noise stationary?
Yes.
Random walk with drift
xt = δ + xt-1 + wt
where δ (the drift) is a constant, {wt} is a white noise process, and x0 = 0.
Can rewrite as:
xt = tδ + ∑tj=1 wj
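The recursion and the rewritten sum give the same series; a quick numpy check (the values of δ, n, and the seed are arbitrary):

```python
import numpy as np

# Random walk with drift: x_t = delta + x_{t-1} + w_t, x_0 = 0,
# equivalently x_t = t*delta + sum_{j=1}^t w_j.
rng = np.random.default_rng(1)
n, delta = 200, 0.1
w = rng.normal(size=n)

# Closed form via cumulative sum of the noise.
x = delta * np.arange(1, n + 1) + np.cumsum(w)

# Recursive form, starting from x_0 = 0.
x_rec = np.empty(n)
prev = 0.0
for t in range(n):
    prev = delta + prev + w[t]
    x_rec[t] = prev
```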
Random walk (drift = 0)
Simulated
Random walk (drift = 0.1)
Simulated
Your turn
Random walk with drift
xt = tδ + ∑tj=1 wj
What is the mean function?
μt = E[xt] = tδ
What is the autocovariance function?
ϒ(t, t+h) = tσ², for h ≥ 0
Is the random walk model stationary?
No.
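To see the non-stationarity numerically, a Monte Carlo sketch (replication count and seed are arbitrary) estimating Var(xt) at two time points; the estimates should track tσ²:

```python
import numpy as np

# Many independent drift-free random walks (delta = 0, sigma^2 = 1):
# theory says Var(x_t) = t * sigma^2, so the variance grows with t.
rng = np.random.default_rng(2)
n_rep, n = 20000, 50
w = rng.normal(size=(n_rep, n))
x = np.cumsum(w, axis=1)      # each row is one simulated walk

var_at_10 = x[:, 9].var()     # estimate of Var(x_10); theory: 10
var_at_50 = x[:, 49].var()    # estimate of Var(x_50); theory: 50
```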
Moving average MA(1)
xt = β1wt-1 + wt
where {wt} is a white noise process.
We'll see higher order MA processes later...
MA(1) β1= 1
Simulated
Your turn
MA(1)
xt = β1wt-1 + wt
What is the mean function?
What is the autocovariance function?
Is MA(1) stationary?
MA(1) β1= 1
ACF for simulated data
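A sketch of the same check in numpy (sample ACF written by hand; β1 = 1 as in the slide, seed arbitrary): the sample ACF should sit near β1/(1 + β1²) = 0.5 at lag 1 and near zero beyond.

```python
import numpy as np

# Simulate MA(1): x_t = beta1 * w_{t-1} + w_t, with beta1 = 1.
rng = np.random.default_rng(3)
n, beta1 = 20000, 1.0
w = rng.normal(size=n + 1)
x = beta1 * w[:-1] + w[1:]

def sample_acf(series, h):
    """Sample autocorrelation at lag h (normalized by the lag-0 term)."""
    c = series - series.mean()
    return (c[:-h] @ c[h:]) / (c @ c)

rho1 = sample_acf(x, 1)  # theory: beta1 / (1 + beta1^2) = 0.5
rho2 = sample_acf(x, 2)  # theory: 0 for lags >= 2
```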
Autoregressive AR(1)
xt = α1xt-1 + wt
where {wt} is a white noise process.
We'll see higher order AR processes later...
AR(1) α1= 0.9
Simulated
AR(1) α1= 0.5
Simulated
AR(1)
What is the mean function?
What is the autocovariance function?
Is AR(1) stationary?
AR(1) α1= 0.9
ACF for simulated data
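The corresponding numpy sketch for the AR(1) with α1 = 0.9 (burn-in length and seed are arbitrary choices); the sample ACF should decay like α1^h:

```python
import numpy as np

# Simulate AR(1): x_t = alpha1 * x_{t-1} + w_t, with alpha1 = 0.9.
rng = np.random.default_rng(4)
n, alpha1 = 50000, 0.9
w = rng.normal(size=n)
x = np.empty(n)
x[0] = w[0]
for t in range(1, n):
    x[t] = alpha1 * x[t - 1] + w[t]
x = x[1000:]  # drop a burn-in so the start-up transient is gone

def sample_acf(series, h):
    """Sample autocorrelation at lag h."""
    c = series - series.mean()
    return (c[:-h] @ c[h:]) / (c @ c)

rho1 = sample_acf(x, 1)  # theory: 0.9
rho5 = sample_acf(x, 5)  # theory: 0.9^5, roughly 0.59
```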
Three stationary models
White noise
ρ(h) = 1, when h = 0
     = 0, otherwise
Only lag 0 shows non-zero ACF.
MA(1), any β1
ρ(h) = 1, when h = 0
     = β1/(1 + β1²), h = 1
     = 0, h ≥ 2
Only lags 0 and 1 show non-zero ACF.
AR(1), |α1| < 1
ρ(h) = 1, when h = 0
     = α1^h, h > 0
Decreasing ACF.
Which models might these simulated data come from?
(figure: five simulated series, panels 1–5)
A General Linear Process
A linear process xt is defined to be a linear combination of white noise variates, Zt,
xt = ∑∞i=0 ψi Zt-i
with
∑∞i=0 |ψi| < ∞
This is enough to ensure stationarity.
Autocovariance
One can show that the autocovariance of a linear process is,
ϒ(h) = σ² ∑∞i=0 ψi+h ψi
Your turn
Write the MA(1) and AR(1) processes in the form of linear processes.
I.e. what are the ψi?
xt = ∑∞i=0 ψi Zt-i
MA(1): xt = β1Zt-1 + Zt
AR(1): xt = α1xt-1 + Zt
Verify the autocovariance functions for MA(1) and AR(1).
ϒ(h) = σ² ∑∞i=0 ψi+h ψi
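For the AR(1), ψi = α1^i, so the formula can be checked numerically by truncating the infinite sum (truncation point is arbitrary) against the closed form ϒ(h) = σ²α1^h / (1 − α1²):

```python
import numpy as np

# psi-weights of a stationary AR(1): psi_i = alpha1^i.
sigma2, alpha1 = 1.0, 0.9
psi = alpha1 ** np.arange(500)   # truncate the sum; alpha1^500 is negligible

def gamma(h):
    """Truncated version of gamma(h) = sigma^2 * sum_i psi_{i+h} psi_i."""
    return sigma2 * np.sum(psi[h:] * psi[: len(psi) - h])

# Closed form for the AR(1) autocovariance.
gamma0_closed = sigma2 / (1 - alpha1**2)
gamma3_closed = sigma2 * alpha1**3 / (1 - alpha1**2)
```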
Backshift Operator
The backshift operator, B, is defined as
Bxt = xt-1
It can be extended to powers in the
obvious way:
B²xt = (BB)xt = B(Bxt) = Bxt-1 = xt-2
So,
B^k xt = xt-k
Your turn
Write the MA(1) and AR(1) models using
the backshift operator.
MA(1): xt = β1Zt-1 + Zt
AR(1): xt = α1xt-1 + Zt
Difference Operator
The difference operator, ∇, is defined as,
∇^d xt = (1 - B)^d xt
(e.g. ∇ xt = (1 - B)xt = xt - xt-1)
(1 - B)^d can be expanded in the usual way, e.g. (1 - B)² = (1 - B)(1 - B) = 1 - 2B + B²
Some non-stationary series can be made
stationary by differencing, see HW#2.
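A numpy sketch of that last point (the values of δ and the seed are arbitrary): differencing a random walk with drift leaves δ + wt, which is stationary.

```python
import numpy as np

# Random walk with drift, then first-difference it: nabla x_t = x_t - x_{t-1}.
rng = np.random.default_rng(5)
n, delta = 10000, 0.1
w = rng.normal(size=n)
x = delta * np.arange(1, n + 1) + np.cumsum(w)

dx = np.diff(x)  # equals delta + w_t: white noise shifted by delta

mean_dx = dx.mean()                        # ~ delta
lag1 = np.corrcoef(dx[:-1], dx[1:])[0, 1]  # ~ 0: no serial correlation left
```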
Roadmap
Extend AR(1) to AR(p) and MA(1) to MA(q)
Combine them to form ARMA(p, q)
processes
Discover a few hiccups, and resolve them.
Then find the ACF (and PACF) functions for
ARMA(p, q) processes.
Figure out how to fit an ARMA(p, q) process to real data.
MA(q) process
A moving average model of order q is defined to be,
xt = Zt + β1Zt-1 + β2Zt-2 + ... + βqZt-q
where Zt is a white noise process with variance σ², and the β1,..., βq are parameters.
Can we write this using B?
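Writing out the MA(q) sum suggests a convolution; a numpy sketch (the MA(2) coefficients and seed here are made-up examples) that also checks the ACF cuts off after lag q:

```python
import numpy as np

# Simulate MA(q) by convolving white noise with (1, beta_1, ..., beta_q).
rng = np.random.default_rng(6)
n = 30000
beta = np.array([1.0, 0.6, 0.3])         # an MA(2) example: (1, b1, b2)
z = rng.normal(size=n + len(beta) - 1)
x = np.convolve(z, beta, mode="valid")   # x_t = z_t + b1 z_{t-1} + b2 z_{t-2}

def sample_acf(series, h):
    """Sample autocorrelation at lag h."""
    c = series - series.mean()
    return (c[:-h] @ c[h:]) / (c @ c)

rho1 = sample_acf(x, 1)  # non-zero: lag 1 <= q
rho3 = sample_acf(x, 3)  # theory: 0, since lag 3 > q = 2
```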
Moving average operator
θ(B) = 1 + β1B + β2B² + ... + βqB^q
Will be important in deriving properties later...
AR(p) process
An autoregressive process of order p is defined to be,
xt = α1xt-1 + α2xt-2 + ... + αpxt-p + Zt
where Zt is a white noise process with variance σ², and the α1,..., αp are parameters.
Can we write this using B?
Autoregressive operator
ɸ(B) = 1 - α1B - α2B² - ... - αpB^p
MA(q): xt = θ(B)Zt
AR(p): ɸ(B)xt = Zt
θ(B) = 1 + β1B + β2B² + ... + βqB^q
ɸ(B) = 1 - α1B - α2B² - ... - αpB^p