Lecture 3: Some Time Series Models

Michael Levine¹
Purdue University
January 20, 2014
¹ These notes owe a lot to Prof. Walid Sharabati and Prof. Bo Li.
Stochastic (Random) Process
For each t, the observed value x_t is treated as a realized value of the random variable X_t, 0 ≤ t ≤ T.
When t is continuous we sometimes write X(t) instead of X_t.
An observed record is simply one record out of a whole collection of possible records which we might have observed.
This collection is called the ensemble.
Each particular record is a realization of the random process.
In practice we need to evaluate properties of the underlying probability model from a single realization!
Figure: several records of the same process. All of them together are the ensemble; each of them is a realization.
Review of Stochastic Processes
Definition
A stochastic process X is a collection of random variables
$$ (X_t,\ t \in T) = (X_t(\omega),\ t \in T,\ \omega \in \Omega), $$
defined on some probability space Ω.
X could be a continuous-time or a discrete-time process.
A process X_t is fully defined if all of its finite-dimensional distributions, i.e. the joint distributions of (X_{t_1+τ}, . . . , X_{t_k+τ}) for all possible τ > 0 and any selection of t_1, . . . , t_k with integer k > 0, are known.
Stochastic (Random) Process I
Mean, Variance, Autocovariance, and Autocorrelation
The mean function µ_t, or µ(t), is defined for all t by
$$ \mu_t = E(X_t) = \int_{-\infty}^{\infty} X \, f_t(X)\, dX. $$
The variance function σ_t², or σ²(t), is defined for all t by
$$ \sigma_t^2 = \operatorname{Var}(X_t) = E(X_t - \mu_t)^2 = E(X_t^2) - \mu_t^2. $$
Stochastic (Random) Process II
The autocovariance function (acv.f.) γ_{t_1,t_2}, or γ(t_1, t_2), of X_{t_1} with X_{t_2} is defined by
$$ \gamma_{t_1,t_2} = E\left[(X_{t_1} - \mu_{t_1})(X_{t_2} - \mu_{t_2})\right] = \iint (X_1 - \mu_{t_1})(X_2 - \mu_{t_2}) \, f_{t_1,t_2}(X_1, X_2)\, dX_1\, dX_2. $$
When t = t_1 = t_2 we get γ_{t,t} = Var(X_t) = σ_t².
The autocorrelation function (ac.f.) ρ_τ is defined, for a stationary process whose acv.f. γ_τ depends only on the lag τ, by
$$ \rho_\tau = \frac{\gamma_\tau}{\gamma_0}. $$
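In practice (recall the earlier remark that the properties of the model must be evaluated from a single realization), γ_τ and ρ_τ are estimated by sample averages. A minimal sketch in Python of the standard sample estimators (added; not from the slides):

```python
import numpy as np

def sample_acvf(x, lag):
    """Sample autocovariance at a given lag: average of (x_t - xbar)(x_{t+lag} - xbar)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xbar = x.mean()
    return np.sum((x[: n - lag] - xbar) * (x[lag:] - xbar)) / n

def sample_acf(x, lag):
    """Sample autocorrelation: rho_hat(lag) = gamma_hat(lag) / gamma_hat(0)."""
    return sample_acvf(x, lag) / sample_acvf(x, 0)

# Demo on a single realization of iid N(0,1) noise:
# rho_hat is 1 at lag 0 and should be near 0 at the other lags.
rng = np.random.default_rng(0)
x = rng.standard_normal(1000)
print([round(sample_acf(x, k), 3) for k in range(4)])
```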
Stationary Time Series I
Strictly Stationary
The overall behavior of the random process X_t is described by the joint distribution function of {X_{t_1}, X_{t_2}, · · · , X_{t_k}} at a finite number of points t_1, t_2, · · · , t_k, for any positive integer k.
This function is
$$ F_{t_1, t_2, \dots, t_k}(X_1, X_2, \dots, X_k) = P(X_{t_1} < X_1, \dots, X_{t_k} < X_k). $$
Definition
A time series X_t is strictly stationary if {X_{t_1}, X_{t_2}, · · · , X_{t_k}} and {X_{t_1+τ}, X_{t_2+τ}, · · · , X_{t_k+τ}} have the same joint distribution for any positive integer k ≥ 1, any selection of t_1, t_2, · · · , t_k, and any shift τ, i.e. the joint distribution function is invariant under time shifts.
Stationary Time Series II
The simplest model for a time series is iid noise (all observations are independent and identically distributed). Then
$$ F_{t_1, \dots, t_n}(X_1, \dots, X_n) = P(X_{t_1} < X_1, \dots, X_{t_n} < X_n) = P(X_{t_1} < X_1) \cdots P(X_{t_n} < X_n) = F_{t_1}(X_1) \cdots F_{t_n}(X_n) = F(X_1) \cdot F(X_2) \cdots F(X_n). $$
For iid noise ε_t ∼ N(µ, σ²),
$$ \operatorname{Cov}(\epsilon_t, \epsilon_{t+\tau}) = \begin{cases} 0, & |\tau| \neq 0, \\ \sigma^2, & \tau = 0, \end{cases} $$
where the covariance vanishes for τ ≠ 0 because the observations are independent.
Second-Order Stationary
A process is called 2nd order stationary (or weakly stationary) if its mean is constant and its acv.f. depends only on the lag, so that
$$ E(X_t) = \mu \quad \text{and} \quad \operatorname{Cov}(X_t, X_{t+\tau}) = \gamma_\tau. $$
Note, by letting τ = 0 we get Var(X_t) = γ_0, which is also a constant (free of t). The definition implicitly requires the mean and variance to be finite.
Strictly Stationary ⇒ 2nd Order Stationary as long as E(X_t²) < ∞.
Example
1. Show that a strictly stationary process with E(X_t²) < ∞ is weakly stationary.
By strict stationarity, X_t has the same distribution as X_0, so
$$ E(X_t) = \int_{-\infty}^{\infty} X f_t(X)\, dX = \int_{-\infty}^{\infty} X f_0(X)\, dX = E(X_0) = \mu. $$
Likewise (X_t, X_{t+τ}) has the same joint distribution as (X_0, X_τ), so
$$ \operatorname{Cov}(X_t, X_{t+\tau}) = \iint (X_t - \mu_t)(X_{t+\tau} - \mu_{t+\tau}) \, f_{t, t+\tau}(X_t, X_{t+\tau})\, dX_t\, dX_{t+\tau} = \iint (X_0 - \mu_0)(X_\tau - \mu_\tau) \, f_{0, \tau}(X_0, X_\tau)\, dX_0\, dX_\tau = \operatorname{Cov}(X_0, X_\tau) = \gamma_\tau, $$
i.e. the process is weakly stationary.
2. If X_t = µ + Z_t + βZ_{t−1}, where µ is a constant and the Z_t are iid with E(Z_t) = 0 and Var(Z_t) = σ_z², find γ_τ.
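A worked sketch of the answer (added here; it is not on the original slide). Using bilinearity of covariance and Cov(Z_s, Z_t) = 0 for s ≠ t:
$$ \gamma_0 = \operatorname{Var}(Z_t + \beta Z_{t-1}) = (1 + \beta^2)\,\sigma_z^2, \qquad \gamma_{\pm 1} = \operatorname{Cov}(Z_t + \beta Z_{t-1},\ Z_{t+1} + \beta Z_t) = \beta\,\sigma_z^2, \qquad \gamma_\tau = 0 \ \text{for } |\tau| \ge 2. $$
Since the mean µ is constant and γ_τ depends only on the lag, X_t (an MA(1) process in standard terminology) is weakly stationary.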
Gaussian (Normal) Stochastic Process
Definition
A stochastic process is Gaussian (normal) if the joint distribution of X_{t_1}, X_{t_2}, · · · , X_{t_k} is multivariate normal for all t_1, t_2, · · · , t_k.
The multivariate normal distribution is completely characterized by its 1st and 2nd moments, so 2nd order stationary ⇔ strictly stationary for normal processes.
Strictly Stationary ⇒ 2nd Order Stationary.
2nd Order Stationary ⇏ Strictly Stationary in general.
The converse ⇐ does hold if X_t is a normal process.
Figure: a sample path of the Gaussian process (X_t, t ∈ [0, 1000]), where the X_t's are iid N(0, 1). The expectation function is E(X_t) = µ_X(t) = 0 and the variance is Var(X_t) = 1.
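A minimal sketch (added; not from the slides) that generates such a sample path in Python, assuming matplotlib is available for the plot:

```python
import numpy as np
import matplotlib.pyplot as plt

# iid N(0,1) Gaussian process on t = 0, 1, ..., 1000: every
# finite-dimensional distribution is multivariate normal with
# zero mean vector and identity covariance matrix.
rng = np.random.default_rng(1)
t = np.arange(1001)
x = rng.standard_normal(t.size)

plt.plot(t, x, linewidth=0.5)
plt.xlabel("t")
plt.ylabel("X_t")
plt.title("One realization of iid N(0,1) noise")
plt.show()
```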
Homogeneous Poisson Process
Definition
A stochastic process (X_t, t ∈ [0, ∞)) is called a homogeneous Poisson process, or simply a Poisson process, with rate λ > 0 if the following conditions are satisfied:
It starts at zero: X_0 = 0.
It has stationary and independent increments.
For every t > 0, X_t has a Poisson Poi(λt) distribution.
Alternative Definition
$$ X_t = \#\{n : T_n \le t\}, \qquad t > 0, $$
where #A denotes the number of elements of any particular set A, T_n = Y_1 + · · · + Y_n, and {Y_i} is a sequence of iid exponential Exp(λ) random variables with common distribution function
$$ P(Y_i \le x) = 1 - e^{-\lambda x}, \qquad x \ge 0. $$
(A simulation sketch based on this construction follows the examples below.)
Examples
telephone calls to be handled by an operator.
customers waiting for service in a queue.
claims arriving in an insurance portfolio.
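The alternative definition suggests a direct simulation: draw iid Exp(λ) interarrival times Y_i, accumulate them into arrival times T_n, and count arrivals up to time t. A minimal sketch (added; not from the slides):

```python
import numpy as np

def poisson_process_arrivals(lam, horizon, rng):
    """Arrival times T_n = Y_1 + ... + Y_n on [0, horizon], Y_i iid Exp(lam)."""
    arrivals = []
    t = 0.0
    while True:
        t += rng.exponential(1.0 / lam)  # Exp(lam) has mean 1/lam
        if t > horizon:
            return np.array(arrivals)
        arrivals.append(t)

rng = np.random.default_rng(2)
T = poisson_process_arrivals(lam=2.0, horizon=10.0, rng=rng)
X_7 = np.searchsorted(T, 7.0, side="right")  # X_7 = #{n : T_n <= 7}
print(len(T), X_7)  # total count on [0,10] is ~ Poi(20); X_7 ~ Poi(14)
```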
Brownian Motion
Definition
A stochastic process B = (B_t, t ∈ [0, ∞)) is called Brownian motion or a Wiener process if the following conditions are satisfied:
It starts at zero: B_0 = 0.
It has stationary and independent increments.
For every t > 0, B_t has a normal N(0, t) distribution.
It has continuous sample paths: "no jumps".
Distribution, Mean and Covariance
For all s < t, B_t − B_s and B_{t−s} have the same N(0, t − s) distribution.
µ_B(t) = 0 and cov_B(t, s) = min(s, t).
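Stationary independent increments give a simple way to simulate Brownian motion on a grid: cumulative sums of independent N(0, Δt) increments. A minimal sketch (added; not from the slides):

```python
import numpy as np

# Simulate Brownian motion on [0, 1] with step dt:
# B_{t+dt} - B_t ~ N(0, dt), increments independent, B_0 = 0.
rng = np.random.default_rng(3)
dt = 0.001
n = 1000
increments = rng.normal(loc=0.0, scale=np.sqrt(dt), size=n)
B = np.concatenate([[0.0], np.cumsum(increments)])  # B[k] approximates B_{k*dt}

# Check the marginal distribution: B[-1] is a draw from N(0, 1).
print(B[-1])
```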
Properties of Brownian Motion
The paths of Brownian motion are continuous but non-differentiable; they are irregular and oscillate wildly.
Increments over adjacent (more generally, disjoint) intervals are independent, whatever the lengths of the intervals.
Brownian motion is 0.5-self-similar:
$$ \left(T^{1/2} B_{t_1}, \dots, T^{1/2} B_{t_n}\right) \stackrel{d}{=} \left(B_{T t_1}, \dots, B_{T t_n}\right), \qquad \forall\, T > 0,\ t_i \ge 0,\ n \ge 1. $$
Hence, its sample paths are nowhere differentiable. Self-similarity is a distributional, not a pathwise, property.
For a given Brownian sample path, the shapes of the curves on different intervals look similar, but they are not scaled copies of each other.
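As a quick sanity check of self-similarity on the one-dimensional marginals (added here; both sides are centered normal with equal variances):
$$ T^{1/2} B_t \sim N(0,\ T t) \quad \text{and} \quad B_{T t} \sim N(0,\ T t). $$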
Brownian Motion with Drift
Consider the process
$$ X_t = \mu t + \sigma B_t, \qquad \mu \in \mathbb{R},\ \sigma > 0,\ t \ge 0. $$
X_t is a Gaussian process (verify!), with
$$ \mu_X(t) = \mu t, \qquad \operatorname{cov}_X(t, s) = \sigma^2 \min(t, s), \qquad s, t \ge 0. $$
X_t is called a Brownian motion with drift.
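A sketch of the verification (added; it uses only E(B_t) = 0 and cov_B(t, s) = min(t, s) from the Brownian motion slide):
$$ \mu_X(t) = E(\mu t + \sigma B_t) = \mu t + \sigma E(B_t) = \mu t, $$
$$ \operatorname{cov}_X(t, s) = \operatorname{Cov}(\mu t + \sigma B_t,\ \mu s + \sigma B_s) = \sigma^2 \operatorname{Cov}(B_t, B_s) = \sigma^2 \min(t, s). $$
Gaussianity holds because any linear combination Σ a_i X_{t_i} = µ Σ a_i t_i + σ Σ a_i B_{t_i} is normal, B being a Gaussian process.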