Hints for the Exercises 1

Throughout these exercises, let $(X_t)_{t\ge 0}$ be a Markov process with countable state space $S = \{x_i : i \in \mathbb N\}$, i.e.:
• $X$ takes only values in $S$, and
• $\forall \tau > 0,\ t \ge 0,\ x \in S:\ P(X_{t+\tau} = x \mid X_s,\ s \le t) = P(X_{t+\tau} = x \mid X_t)$.

First, we consider $S = \mathbb N_0$ and assume that $p_n(t) := P(X_t = n)$ satisfies the following transition rule:
\[
\dot p_n(t) = -\lambda p_n(t) + \lambda p_{n-1}(t), \qquad p_0(0) = 1,\quad p_i(0) = 0\ \ \forall i > 0. \tag{1}
\]
In this case, $X$ is called a Poisson counter or Poisson process with rate $\lambda$.

(i) Write, for arbitrary $n \in \mathbb N$,
\[
\begin{pmatrix} \dot p_0 \\ \dot p_1 \\ \vdots \\ \dot p_n \end{pmatrix}
= A \cdot \begin{pmatrix} p_0 \\ p_1 \\ \vdots \\ p_n \end{pmatrix},
\qquad
p(0) = \begin{pmatrix} 1 \\ 0 \\ \vdots \\ 0 \end{pmatrix} \tag{2}
\]
with the correct matrix $A \in \mathbb R^{(n+1)\times(n+1)}$. Solve this ODE and recall that its solution is given by
\[
p(t) = \exp(At) \cdot p(0), \tag{3}
\]
where $\exp(\cdot)$ denotes the matrix exponential.

(ii) In the discrete case, expectation is particularly easy: $E[X_t] = \sum_{n=0}^{\infty} n\, P(X_t = n) = \sum_{n=0}^{\infty} n\, p_n(t)$.

(iii) Compute the moment-generating function $M_t$ of $X_t$ for each $t \ge 0$ and recall that
\[
E[X_t^k] = \left. \frac{\partial^k}{\partial u^k} M_t(u) \right|_{u=0}.
\]
Then compute the first two or three moments, for instance.

Now assume $S = \mathbb Z$ and let $X$ be a bi-directional Poisson counter, i.e.:
\[
\dot p_i(t) = \lambda p_{i-1}(t) - 2\lambda p_i(t) + \lambda p_{i+1}(t),\quad i \in \mathbb Z, \qquad p_0(0) = 1,\quad p_i(0) = 0\ \ \forall i \ne 0. \tag{4}
\]
Compute the probability-generating function of $X_t$, namely
\[
g_t(z) := E[z^{X_t}] = \sum_{i=-\infty}^{\infty} z^i p_i(t), \qquad z \in \mathbb C,\ |z| \le 1,\ z \ne 0.
\]
Obviously, $p_i(t)$ is the coefficient of $z^i$ in the (Laurent) expansion of $g_t$. In order to obtain this, show that
\[
\frac{\partial}{\partial t} g_t(z) = \lambda (z - 2 + z^{-1})\, g_t(z)
\]
holds for every $|z| \le 1$, $z \ne 0$, and solve this differential equation. Finally, use the binomial formula during the expansion to compute $p_i(t)$.

Consider now stochastic differential equations driven by Poisson counters:
\[
X_t = X_0 + \int_0^t f(X_s, s)\,ds + \int_0^t g(X_s, s)\,dN_s \tag{5}
\]
or, in short,
\[
dX = f(X, t)\,dt + g(X, t)\,dN. \tag{6}
\]
Note that (5) has to be understood in the pathwise sense, i.e., it holds for every $\omega \in \Omega$, whereas in the Brownian case a pathwise definition of the integral is not possible.

Definition 0.1 A right-continuous process $X$ is the solution of (5) in the Itô sense if $X$ satisfies the following two properties:
• On intervals where $N$ is constant, we have $\dot X_t = f(X_t, t)$, and
• if $N$ jumps at time $t$, we have that $\Delta X_t := X_t - X_{t-} = g(X_{t-}, t)$.
Here, $X_{t-} := \lim_{s \uparrow t} X_s$.

(iv) Itô's differential rule has the following form: if $\psi$ is a $C^1$-function, then
\[
\psi(X_t) - \psi(X_0) = \int_0^t \psi'(X_s) f(X_s, s)\,ds + \int_0^t \bigl[\psi(X_s + g(X_s, s)) - \psi(X_s)\bigr]\,dN_s. \tag{7}
\]
To show this, prove that $\psi(X)$ satisfies Definition 0.1 if the right-hand side of (7) is given.

(v) For this exercise, one has to assume the following:
• There exists a differentiable function $\rho : \mathbb R_+ \times \mathbb R \to \mathbb R_+$ such that $P(X_t \in A) = \int_A \rho(t, x)\,dx$ for every $t \ge 0$ and Borel set $A$.
• The function $\tilde g_t : \mathbb R \to \mathbb R$, $\tilde g_t(x) := x + g(x, t)$, is bijective for every $t \ge 0$.
• $f$ and $g$ are both differentiable.
To start with, use (5) to show that the evolution of the expectation of $X_t$ is given by
\[
\frac{d}{dt} E[X_t] = E[f(X_t, t)] + E[g(X_t, t)] \cdot \lambda.
\]
Now take some arbitrary $C^1$-function $\psi$ and use Itô's formula to compute $\frac{d}{dt} E[\psi(X_t)]$. Compare this with the fact that $E[\psi(X_t)] = \int \psi(x) \rho(t, x)\,dx$. Integration by parts and using the substitution $dz = \bigl(1 + \frac{\partial}{\partial x} g(x, t)\bigr)\,dx$ yield the rest.

Now, we consider finite-state jump processes $X$, i.e. $S = \{x_1, \dots, x_n\}$, and the probabilities $p_i(t)$ satisfy (2) with some arbitrary matrix $A \in \mathbb R^{n \times n}$. $A$ is called the infinitesimal generator or intensity matrix of $X$. Due to the non-negativity and the conservation of probability, $A$ must satisfy the following conditions:
• $\sum_{i=1}^n a_{ij} = 0$ for all $j = 1, \dots, n$, and
• $a_{ij} \ge 0$ for all $i, j = 1, \dots, n$ with $i \ne j$.
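For hint (i), a minimal numerical sketch may help to see what the matrix $A$ looks like and that (3) indeed reproduces the Poisson probabilities; it also checks the intensity-matrix conditions stated above on the truncated system. This is only an illustration: it assumes NumPy/SciPy are available, and the truncation level n, the rate lam, and the time t are arbitrary choices, not part of the exercise sheet.

```python
# Minimal numerical check of hint (i) and the intensity-matrix conditions.
# Assumes NumPy/SciPy; n (truncation level), lam and t are illustrative choices.
import numpy as np
from scipy.linalg import expm
from scipy.stats import poisson

lam, n, t = 2.0, 30, 1.5          # rate, truncation level, time horizon

# Generator of the (truncated) Poisson counter: -lam on the diagonal,
# +lam on the first subdiagonal, so each column sums to 0 (except the last,
# which loses mass to the cut-off states > n).
A = -lam * np.eye(n + 1) + lam * np.diag(np.ones(n), k=-1)

p0 = np.zeros(n + 1)
p0[0] = 1.0                        # p_0(0) = 1, p_i(0) = 0 for i > 0

p_t = expm(A * t) @ p0             # p(t) = exp(At) p(0), cf. (3)

# Compare with the Poisson(lam*t) pmf, which solving (1) should produce.
print(np.max(np.abs(p_t - poisson.pmf(np.arange(n + 1), lam * t))))

# Intensity-matrix conditions: off-diagonal entries >= 0, columns sum to 0.
off_diag = A - np.diag(np.diag(A))
print(np.all(off_diag >= 0), np.allclose(A.sum(axis=0)[:-1], 0.0))
```

The printed difference should be at machine precision: since $A$ is lower triangular, the equations for $p_0, \dots, p_n$ do not involve any higher states, so the truncated system coincides with the full one on its first $n+1$ components.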
(vi) In order to compute $E[X_t X_{t+\tau}]$, observe that $P(X_{t+\tau} = x_j \mid X_t = x_i) = (\exp(A\tau))_{ij}$ for every $i, j = 1, \dots, n$.

(vii) Let $(Y_t)_{t\ge 0}$ be a bi-directional Poisson counter of rate $\lambda/2$ and let $X_\lambda(t) := Y_t/\sqrt{\lambda}$. Then we can find Poisson counters $N_1$ and $N_2$ such that $X_\lambda = \lambda^{-1/2}(N_1 - N_2)$. Use Itô's differential rule to prove
\[
dX_\lambda^p = \Bigl[\Bigl(X_\lambda + \tfrac{1}{\sqrt{\lambda}}\Bigr)^p - X_\lambda^p\Bigr]\,dN_1 + \Bigl[\Bigl(X_\lambda - \tfrac{1}{\sqrt{\lambda}}\Bigr)^p - X_\lambda^p\Bigr]\,dN_2.
\]
Now compute $\frac{d}{dt} E[X_\lambda^p(t)]$ for odd and even $p$ separately. Deduce that for even $p$, we have
\[
\lim_{\lambda\to\infty} E[X_\lambda^p(t)] = \frac{1}{2} \int_0^t p(p-1) \lim_{\lambda\to\infty} E[X_\lambda^{p-2}(s)]\,ds
\]
and solve these equations. Observe that these moments correspond to a $N(0, t)$ distribution. Finally, show that $\frac{d}{d\tau} E[X_\lambda(t) X_\lambda(\tau)] = 0$ for $\tau \ge t$ and check the defining properties of Brownian motion.

(viii) Let $\psi$ be a $C^2$-function and $X$ be a solution of
\[
X_t = X_0 + \int_0^t f(X_s, s)\,ds + \int_0^t g(X_s, s)\,dW_s \tag{8}
\]
where $(W_t)_{t\ge 0}$ is a Brownian motion. Our aim is to derive Itô's formula for Brownian motion, i.e.:
\[
d\psi(X) = \psi'(X) f(X, t)\,dt + \psi'(X) g(X, t)\,dW + \frac{1}{2} \psi''(X) g^2(X, t)\,dt. \tag{9}
\]
To this end, first take $X_\lambda$ as the solution of
\[
X_\lambda(t) = X_\lambda(0) + \int_0^t f(X_\lambda(s), s)\,ds + \int_0^t g(X_\lambda(s), s)\,(dN_1 - dN_2)/\sqrt{\lambda}
\]
with two independent Poisson counters $N_1, N_2$ of rate $\lambda/2$. Use (7) to compute $d\psi(X_\lambda)$ and use Taylor's formula to derive
\[
d\psi(X_\lambda) = \psi'(X_\lambda) f(X_\lambda, t)\,dt + \psi'(X_\lambda) g(X_\lambda, t)\,(dN_1 - dN_2)/\sqrt{\lambda}
+ \frac{1}{2} \psi''(X_\lambda) g^2(X_\lambda, t)\,(dN_1 + dN_2)/\lambda + O(\lambda^{-3/2})\,dN_1.
\]
Show that the variance of $Z_\lambda(t) = (N_1(t) + N_2(t))/\lambda$ is just $t/\lambda$ and conclude that $Z_\lambda(t)$ tends to $Z(t) = t$ for $\lambda \to \infty$, which finishes the proof.
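As a quick sanity check on hint (vii), the following Monte Carlo sketch compares the empirical moments of $X_\lambda(t) = (N_1(t) - N_2(t))/\sqrt{\lambda}$ at a fixed time with the moments of $N(0, t)$. It only needs the marginal distributions $N_i(t) \sim \mathrm{Poisson}(\lambda t/2)$; NumPy is assumed, and the values of lam, t and the sample size are illustrative choices, not taken from the exercise sheet.

```python
# Monte Carlo illustration of hint (vii): for large lam, the scaled difference
# of two independent rate-lam/2 Poisson counters at a fixed time t should have
# moments close to those of N(0, t).
# Assumes NumPy; lam, t and the sample size are illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
lam, t, n_samples = 1e4, 2.0, 200_000

N1 = rng.poisson(lam * t / 2, size=n_samples)   # N1(t), counter of rate lam/2
N2 = rng.poisson(lam * t / 2, size=n_samples)   # N2(t), counter of rate lam/2
X = (N1 - N2) / np.sqrt(lam)                    # X_lam(t) = (N1 - N2)/sqrt(lam)

# Odd moments should vanish; even moments should approach those of N(0, t):
# E[X^2] ~ t and E[X^4] ~ 3 t^2, consistent with the recursion for even p.
print("mean      :", X.mean(),       "  (target 0)")
print("variance  :", X.var(),        f"  (target {t})")
print("4th moment:", np.mean(X**4),  f"  (target {3 * t**2})")
```

With $\lambda$ of this order, the second and fourth moments should match $t$ and $3t^2$ up to Monte Carlo noise, in line with the limiting moment recursion of (vii).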