Advanced Statistics
Prof. Dr. Bernd Wilfling
Dipl.-Volksw. Sarah Meyer
Winter Term 2014/2015
Exercises (Sheet #4)
1. Let X be a 2-dimensional random vector X = (X, Y)' with a multivariate normal distribution. Let µ = (µX, µY)' be the expectation and
\[
\Sigma = \begin{pmatrix} \sigma_X^2 & \sigma_{XY} \\ \sigma_{YX} & \sigma_Y^2 \end{pmatrix}
\]
the covariance matrix of X, where σXY = σYX = Cov(X, Y). Further, let ρ = Corr(X, Y) be the correlation coefficient of X and Y. Show by means of the definition of the multivariate normal distribution that the joint probability density function of X and Y is given by
\[
f_{X,Y}(x, y) = \frac{1}{2\pi\sigma_X\sigma_Y\sqrt{1-\rho^2}}
\exp\left\{ -\frac{1}{2(1-\rho^2)}
\left[ \frac{(x-\mu_X)^2}{\sigma_X^2}
- \frac{2\rho(x-\mu_X)(y-\mu_Y)}{\sigma_X\sigma_Y}
+ \frac{(y-\mu_Y)^2}{\sigma_Y^2} \right] \right\}.
\]
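Before attempting the proof, the identity can be checked numerically: for any admissible parameter values the closed-form expression above should coincide with the density implied by µ and Σ. A minimal sketch in Python (all parameter values below are arbitrary illustrations):

import numpy as np
from scipy.stats import multivariate_normal

# Arbitrary illustrative parameters.
mu_x, mu_y = 1.0, -0.5
sigma_x, sigma_y, rho = 1.5, 0.8, 0.6
sigma_xy = rho * sigma_x * sigma_y

def f_xy(x, y):
    """Closed-form bivariate normal density as stated in Exercise 1."""
    q = ((x - mu_x)**2 / sigma_x**2
         - 2 * rho * (x - mu_x) * (y - mu_y) / (sigma_x * sigma_y)
         + (y - mu_y)**2 / sigma_y**2)
    return np.exp(-q / (2 * (1 - rho**2))) / (2 * np.pi * sigma_x * sigma_y * np.sqrt(1 - rho**2))

# Reference density built directly from the mean vector and covariance matrix.
mvn = multivariate_normal(mean=[mu_x, mu_y],
                          cov=[[sigma_x**2, sigma_xy],
                               [sigma_xy, sigma_y**2]])

for x, y in [(0.0, 0.0), (1.2, -0.3), (-2.0, 1.0)]:
    print(f_xy(x, y), mvn.pdf([x, y]))  # the two values should coincide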
2. Consider the situation of Exercise 1. Prove the following statement:
“X and Y are independent if and only if ρ(X, Y ) = 0.”
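One direction of the statement can be illustrated numerically with the density from Exercise 1: setting ρ = 0 there should make the joint density factor into the product of the two marginal normal densities. A short sketch (parameter values arbitrary):

import numpy as np
from scipy.stats import norm

mu_x, mu_y, sigma_x, sigma_y = 0.5, 2.0, 1.2, 0.7  # arbitrary illustration

def f_xy_rho0(x, y):
    """Joint density of Exercise 1 evaluated with rho = 0."""
    q = (x - mu_x)**2 / sigma_x**2 + (y - mu_y)**2 / sigma_y**2
    return np.exp(-q / 2) / (2 * np.pi * sigma_x * sigma_y)

for x, y in [(0.0, 1.5), (1.0, 2.5), (-0.7, 2.2)]:
    joint = f_xy_rho0(x, y)
    product = norm.pdf(x, mu_x, sigma_x) * norm.pdf(y, mu_y, sigma_y)
    print(joint, product)  # equal values illustrate the factorisation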
3. Consider the situation of Exercise 1. Prove that the following statements hold for
the conditional distributions of X and Y :
\[
X \mid Y = y \sim N\!\left( \mu_X + \rho\,\frac{\sigma_X}{\sigma_Y}\,(y - \mu_Y),\; \sigma_X^2 (1 - \rho^2) \right),
\]
\[
Y \mid X = x \sim N\!\left( \mu_Y + \rho\,\frac{\sigma_Y}{\sigma_X}\,(x - \mu_X),\; \sigma_Y^2 (1 - \rho^2) \right).
\]
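Here, too, a numerical plausibility check is possible: the ratio fX,Y(x, y)/fY(y) should agree with the stated conditional density of X given Y = y. A sketch reusing the joint density via scipy (parameter values arbitrary):

import numpy as np
from scipy.stats import norm, multivariate_normal

mu_x, mu_y, sigma_x, sigma_y, rho = 0.0, 1.0, 2.0, 0.5, -0.4  # arbitrary
cov = [[sigma_x**2, rho * sigma_x * sigma_y],
       [rho * sigma_x * sigma_y, sigma_y**2]]
joint = multivariate_normal(mean=[mu_x, mu_y], cov=cov)

y = 1.3                                               # condition on Y = y
cond_mean = mu_x + rho * sigma_x / sigma_y * (y - mu_y)
cond_sd = sigma_x * np.sqrt(1 - rho**2)

for x in [-1.0, 0.0, 2.5]:
    lhs = joint.pdf([x, y]) / norm.pdf(y, mu_y, sigma_y)  # f_{X,Y}(x, y) / f_Y(y)
    rhs = norm.pdf(x, cond_mean, cond_sd)                 # claimed conditional density
    print(lhs, rhs)  # the two columns should match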
4. Let X1, X2, X3 be uncorrelated random variables with identical variance σ². Find
the correlation coefficient of X1 + X2 and X2 + X3 .
5. Let X1 and X2 be uncorrelated random variables with variances σi², i = 1, 2. Find
the correlation coefficient of X1 + X2 and X2 − X1 .
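For Exercises 4 and 5, a candidate answer can be compared with empirical correlations from simulated data. The sketch below uses independent normal draws merely as one admissible choice of uncorrelated variables; the σ-values are arbitrary:

import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Exercise 4: uncorrelated X1, X2, X3 with identical variance.
x1, x2, x3 = rng.standard_normal((3, n))
print(np.corrcoef(x1 + x2, x2 + x3)[0, 1])

# Exercise 5: uncorrelated X1, X2 with variances sigma_1^2 and sigma_2^2.
s1, s2 = 1.0, 2.0                       # arbitrary standard deviations
y1 = s1 * rng.standard_normal(n)
y2 = s2 * rng.standard_normal(n)
print(np.corrcoef(y1 + y2, y2 - y1)[0, 1])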
6. Let X1 and X2 be i.i.d. random variables with probability density function
\[
f_{X_i}(x) =
\begin{cases}
1 & \text{for } x \in [0, 1] \\
0 & \text{otherwise}
\end{cases}
\qquad \text{for } i = 1, 2.
\]
Find the density function of Y = X1 + X2 .
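Before (or after) deriving the density by convolution, a histogram of simulated sums provides a useful cross-check against the candidate density. A minimal sketch:

import numpy as np

rng = np.random.default_rng(1)
n = 500_000
y = rng.uniform(0.0, 1.0, size=n) + rng.uniform(0.0, 1.0, size=n)

# Empirical density of Y = X1 + X2 on a grid over [0, 2]; compare these values
# with the density obtained analytically.
hist, edges = np.histogram(y, bins=40, range=(0.0, 2.0), density=True)
midpoints = (edges[:-1] + edges[1:]) / 2
for m, h in zip(midpoints, hist):
    print(f"{m:.3f}  {h:.3f}")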
7. Let X be a continuous random variable with probability density function fX (x)
and distribution function FX (x). Consider the function g(x) = exp(ax + b). Find
the probability density function and the distribution function of Y = g(X).
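A derived density or distribution function for Y can be validated by simulation. The sketch below uses X ∼ N(0, 1) purely as an example input distribution and assumes a > 0; in that special case ln(Y) = aX + b is N(b, a²), so the simulated Y should match a lognormal reference, and the same histogram can be reused to check the solution of the exercise:

import numpy as np
from scipy.stats import lognorm

a, b = 0.5, 1.0                          # example constants, a > 0 assumed
rng = np.random.default_rng(2)
x = rng.standard_normal(1_000_000)       # example choice: X ~ N(0, 1)
y = np.exp(a * x + b)                    # Y = g(X) = exp(aX + b)

hist, edges = np.histogram(y, bins=60, range=(0.2, 15.0), density=True)
mid = (edges[:-1] + edges[1:]) / 2
ref = lognorm.pdf(mid, s=a, scale=np.exp(b))  # lognormal reference for this example
print(np.max(np.abs(hist - ref)))        # a small value indicates good agreement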
8. Assume that X1, ..., Xn are independent random variables with Xi ∼ N(µi, σi²). Furthermore, let a1, ..., an ∈ R be constants. Find the distribution of the weighted sum Y = \sum_{i=1}^{n} a_i X_i.
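Whatever distribution is conjectured for Y, its moments can be compared with simulated weighted sums. A minimal sketch with arbitrary example parameters for n = 3:

import numpy as np

rng = np.random.default_rng(3)
mu = np.array([1.0, -2.0, 0.5])          # example means mu_i
sigma = np.array([0.5, 1.5, 1.0])        # example standard deviations sigma_i
a = np.array([2.0, -1.0, 0.5])           # example weights a_i

n_sim = 200_000
x = rng.normal(mu, sigma, size=(n_sim, 3))   # independent N(mu_i, sigma_i^2) draws
y = x @ a                                    # weighted sums a_1 X_1 + ... + a_n X_n

# Empirical mean and variance of Y, to be compared with the conjectured distribution.
print(y.mean(), y.var())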
9. Let X1, . . . , Xn be an i.i.d. sample with unknown distribution. Let further µ and σ² < ∞ be the expectation and variance of that distribution and a1, . . . , an ∈ R constants satisfying \sum_{i=1}^{n} a_i = 1.
(a) Prove that µ̂ = \sum_{i=1}^{n} a_i X_i is an unbiased estimator of µ.
(b) Prove that for n = 2 the variance σ̂µ² of µ̂ is minimized by a1 = a2 = 1/2.
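Both parts can be illustrated numerically: with weights summing to one, the empirical mean of µ̂ stays close to µ for every choice of (a1, a2), while the empirical variance should be smallest near a1 = a2 = 1/2. A sketch with an arbitrary example distribution:

import numpy as np

rng = np.random.default_rng(4)
mu, sigma = 3.0, 2.0                        # example expectation and standard deviation
n_rep = 100_000                             # number of simulated samples of size n = 2
x = rng.normal(mu, sigma, size=(n_rep, 2))  # any distribution with finite variance would do

for a1 in [0.0, 0.25, 0.5, 0.75, 1.0]:
    a = np.array([a1, 1.0 - a1])            # weights summing to one
    mu_hat = x @ a
    print(a1, mu_hat.mean(), mu_hat.var())  # mean ~ mu throughout; variance smallest at a1 = 0.5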