
AMS 263 — Stochastic Processes (Fall 2014)
Homework 1 (due Thursday October 23)
1. Consider real-valued random variables Ai, Bi, i = 1, ..., k, such that E(Ai) = E(Bi) = 0 and
Var(Ai) = Var(Bi) = σi^2 > 0, for i = 1, ..., k. Moreover, assume they are mutually uncorrelated,
that is, E(Ai Al) = E(Bi Bl) = 0, for i ≠ l, and E(Ai Bl) = 0, for all i, l. Define the stochastic
process X = {Xt : t ∈ R} by Xt = Σ_{i=1}^{k} (Ai cos(wi t) + Bi sin(wi t)), where wi, i = 1, ..., k, are
real constants. Show that X is weakly stationary.
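Optional numerical sketch for Problem 1 (not part of the assigned proof): the Python snippet below uses hypothetical values of k, σi, and wi, and draws the coefficients as independent Gaussians, which is one valid instance of the stated moment assumptions. It estimates E(Xs X_{s+h}) by Monte Carlo for several starting points s; under weak stationarity the estimates should agree up to simulation error.

import numpy as np

# Monte Carlo sketch for Problem 1; k, sigma_i, w_i below are hypothetical.
rng = np.random.default_rng(0)
k = 3
sigma = np.array([1.0, 0.5, 2.0])   # assumed sigma_i values
w = np.array([0.7, 1.3, 2.9])       # assumed frequencies w_i
n_rep = 100_000                     # independent replications of (A, B)

A = rng.normal(0.0, sigma, size=(n_rep, k))
B = rng.normal(0.0, sigma, size=(n_rep, k))

def X(t):
    # one value of X_t for each replication
    return (A * np.cos(w * t) + B * np.sin(w * t)).sum(axis=1)

h = 0.4
for s in (0.0, 1.0, 5.0):
    print(f"s = {s:3.1f}:  E(X_s X_(s+h)) ≈ {np.mean(X(s) * X(s + h)):.3f}")
# the three estimates should agree (up to Monte Carlo error), and E(X_t) ≈ 0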
2. Consider a discrete-time real-valued stochastic process X = {Xn : n ≥ 1} defined by Xn =
cos(nU ), where U is uniformly distributed on (−π, π). Show that X is weakly stationary but
not strongly stationary.
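Optional numerical sketch for Problem 2 (not a substitute for the proof): the snippet estimates the first two moments of Xn at several times, together with one third-order joint moment, which can suggest where to look when comparing joint distributions before and after a time shift.

import numpy as np

# Monte Carlo sketch for Problem 2: moments of X_n = cos(nU) at a few origins.
rng = np.random.default_rng(1)
U = rng.uniform(-np.pi, np.pi, size=500_000)

def X(n):
    return np.cos(n * U)

for n in (1, 2, 3):
    print(f"n = {n}: E(X_n) ≈ {X(n).mean():+.3f}, "
          f"Var(X_n) ≈ {X(n).var():.3f}, "
          f"E(X_n X_(n+1) X_(n+2)) ≈ {np.mean(X(n) * X(n + 1) * X(n + 2)):+.3f}")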
3. Let {Zn}, for integer n, be a sequence of real-valued random variables with E(Zn) = 0, Var(Zn) =
1 and E(Zn Zm) = 0, for n ≠ m. Consider a moving average process associated with {Zn}, that is,
a discrete-time real-valued process Y = {Yn}, with integer n, given by Yn = Zn + a Z_{n−1}, where
a is a real constant. Show that Y is weakly stationary and find its covariance function. Obtain
the spectral density function of Y .
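Optional simulation sketch for Problem 3 (a = 0.6 is a hypothetical choice, and the Zn are drawn as standard Gaussians, one valid instance of the assumptions): simulate a long path of Y and compare the empirical autocovariances at small lags with whatever closed form you derive.

import numpy as np

# Simulation sketch for Problem 3; a and the Gaussian Z_n are assumptions.
rng = np.random.default_rng(2)
a = 0.6
Z = rng.normal(0.0, 1.0, size=1_000_001)
Y = Z[1:] + a * Z[:-1]              # Y_n = Z_n + a Z_{n-1}

for lag in (0, 1, 2, 3):
    c_hat = np.mean(Y[lag:] * Y[:len(Y) - lag])
    print(f"lag {lag}: empirical covariance ≈ {c_hat:.3f}")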
4. Consider a weakly stationary process X = {Xt : t ∈ R} with zero mean and unit variance. Find
the correlation function of X if the spectral density function f of X is given by:
(a) f(u) = (2π)^{−1/2} exp(−0.5 u^2), u ∈ R.
(b) f(u) = 0.5 exp(−|u|), u ∈ R.
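Optional numerical check for Problem 4, assuming the inversion formula c(t) = ∫ e^{iut} f(u) du over R (the convention in which f integrates to the variance; adjust constants if your course uses a different placement of the 2π factor): the correlation function can be evaluated by quadrature and compared against your closed-form answer.

import numpy as np
from scipy.integrate import quad

# Numerical inversion of the spectral density; since f is even, only the
# cosine part of e^{iut} contributes.
def corr(t, f):
    val, _ = quad(lambda u: np.cos(u * t) * f(u), -np.inf, np.inf)
    return val

f_a = lambda u: (2 * np.pi) ** (-0.5) * np.exp(-0.5 * u ** 2)
f_b = lambda u: 0.5 * np.exp(-abs(u))

for t in (0.0, 0.5, 1.0, 2.0):
    print(f"t = {t}:  c_a(t) ≈ {corr(t, f_a):.4f},  c_b(t) ≈ {corr(t, f_b):.4f}")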
5. Show that a Gaussian process is strongly stationary if and only if it is weakly stationary.
6. Let W = {Wt : t ≥ 0} be a continuous-time, real-valued Gaussian process such that:
(a) W0 = ω, where ω is a real constant.
(b) W has independent increments, that is, the random variables Wt1 − Ws1 ,...,Wtn − Wsn are
independent whenever the intervals (sj , tj ], j = 1, ..., n, are disjoint.
(c) W_{s+t} − Ws follows a N(0, σ^2 t) distribution, for all s, t ≥ 0, where σ^2 is a positive constant.
Define the continuous-time, real-valued stochastic process X = {Xt : t ≥ 1} by Xt = Wt − W_{t−1}.
Show that X is strongly stationary and find its spectral density function.
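Optional simulation sketch for Problem 6 (ω and σ^2 below are hypothetical choices): build W from its independent Gaussian increments on a fine grid, form Xt = Wt − W_{t−1}, and estimate moments and lagged covariances by Monte Carlo; under stationarity the estimates for different starting points s should agree.

import numpy as np

# Simulation sketch for Problem 6; omega, sigma^2, dt, and the lags are
# assumptions made only for illustration.
rng = np.random.default_rng(3)
omega, sigma2, dt, n_paths = 1.5, 2.0, 0.01, 20_000
grid = np.arange(0.0, 3.5 + dt / 2, dt)          # time grid 0, dt, ..., 3.5
incs = rng.normal(0.0, np.sqrt(sigma2 * dt), size=(n_paths, len(grid) - 1))
W = omega + np.concatenate([np.zeros((n_paths, 1)), np.cumsum(incs, axis=1)], axis=1)

def X(t):
    i = int(round(t / dt))
    j = int(round((t - 1.0) / dt))
    return W[:, i] - W[:, j]                     # X_t = W_t - W_{t-1}, t >= 1

for s in (1.0, 2.0):
    for h in (0.0, 0.5, 1.5):
        cov = np.mean(X(s) * X(s + h)) - X(s).mean() * X(s + h).mean()
        print(f"s = {s}, h = {h}: Cov(X_s, X_(s+h)) ≈ {cov:.3f}")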
7. Gaussian Markov processes. By definition, a continuous-time real-valued stochastic process
X = {Xt : t ∈ R} is called a Markov process if for all n, for all x, x1 ,...,xn−1 , and all increasing
sequences t1 < t2 < ... < tn of index points,
Pr(Xtn ≤ x | Xt1 = x1 , ..., Xtn−1 = xn−1 ) = Pr(Xtn ≤ x | Xtn−1 = xn−1 ).
Consider a (real-valued) Gaussian process Y = {Yt : t ∈ R}. Show that Y is a Markov process
if and only if E(Ytn | Yt1 = y1 , ..., Ytn−1 = yn−1 ) = E(Ytn | Ytn−1 = yn−1 ), for all n, y1 ,...,yn−1
and all increasing sequences t1 < t2 < ... < tn of index points.
8. Stationary Gaussian Markov processes. Consider a continuous-time real-valued stochastic
process X = {Xt : t ≥ 0}, which is assumed to be Gaussian, stationary (with mean 0), and
Markov. Show that the covariance function c(·) of X satisfies the functional equation
c(0)c(t1 + t2 ) = c(t1 )c(t2 ), ∀ t1 , t2 ≥ 0.