Probability Theory and Mathematical Statistics
Problem 1. Let U = min(X, Y ) and V = max(X, Y ), where X and Y are independent
U(0, 1) random variables.
(i) Compute the cdf’s FU (u) and FV (v) and hence derive the pdf’s fU (u) and fV (v)
of the random variables U and V , respectively.
(ii) Compute the joint cdf F(U,V ) (u, v), 0 ≤ u ≤ v ≤ 1, of the bivariate random variable
(U, V ) and use it to derive the pdf f(U,V ) (u, v).
(iii) Are the random variables U and V independent? Explain!
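Parts (i) and (ii) can be sanity-checked numerically. A minimal Monte Carlo sketch (illustrative Python, not part of the problem; the closed-form cdfs F_U(u) = 1 − (1 − u)² and F_V(v) = v² quoted in the comments are the standard results the problem asks you to derive):

```python
import random

# For X, Y i.i.d. U(0,1): U = min(X, Y), V = max(X, Y).
# Claimed cdfs: F_U(u) = 1 - (1 - u)^2 and F_V(v) = v^2.
random.seed(0)
n = 200_000
pairs = [(random.random(), random.random()) for _ in range(n)]

u0, v0 = 0.3, 0.7
est_FU = sum(min(x, y) <= u0 for x, y in pairs) / n
est_FV = sum(max(x, y) <= v0 for x, y in pairs) / n
FU = 1 - (1 - u0) ** 2   # = 0.51
FV = v0 ** 2             # = 0.49
print(abs(est_FU - FU) < 0.01, abs(est_FV - FV) < 0.01)
```

The same empirical-cdf comparison, done at several points u and v, is a quick way to catch a sign or exponent slip in your derivation.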
Problem 2. Suppose that a random variable X has the following pdf
fX(x) = x/4 + 1/2   if −2 ≤ x ≤ 0,
        1/2 − x/4   if 0 ≤ x ≤ 2,
        0           elsewhere.
(i) Sketch the graph of fX .
(ii) Find the cdf FX of X and sketch its graph.
(iii) Find E(X), E(X^2), Var(X) and the third quartile of X.
(iv) Find the mgf MX(t) of X, then find the value MX(0). Further, calculate the value of lim_{t→0} MX(t).
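The moment computations in (iii) can be cross-checked by numerical integration; a deterministic midpoint-rule sketch (illustrative, using the triangular pdf as stated in the problem):

```python
# Midpoint-rule check of the normalization and first two moments of the
# triangular pdf f(x) = x/4 + 1/2 on [-2, 0] and 1/2 - x/4 on [0, 2].
def f(x):
    if -2 <= x <= 0:
        return x / 4 + 1 / 2
    if 0 < x <= 2:
        return 1 / 2 - x / 4
    return 0.0

N = 100_000
h = 4 / N
xs = [-2 + (i + 0.5) * h for i in range(N)]
total = sum(f(x) * h for x in xs)            # integrates to 1
mean = sum(x * f(x) * h for x in xs)         # E(X) = 0 by symmetry
second = sum(x * x * f(x) * h for x in xs)   # E(X^2) = 2/3
print(round(total, 6), round(mean, 6), round(second, 6))
```

Since f is piecewise linear, the midpoint rule is essentially exact here, so the check is tight rather than statistical.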
Problem 3. Let U be a uniformly distributed random variable on the interval [0, 1]. Let
X = −ln( 2U / (1 + U) ).
(i) Give the set SX of possible values of X.
(ii) Find the cdf FX (x) of X.
(iii) Deduce from (ii) the expression of the pdf fX (x) of X.
(iv) Calculate E(X).
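A simulation sketch for this transformation (illustrative Python; the cdf value used in the check is my own derivation from P(2U/(1+U) ≥ e^{−x}), so treat it as something to verify against your answer to (ii)):

```python
import math
import random

# Simulate X = -ln(2U/(1+U)) for U ~ U(0,1) and check the support and
# one cdf value: the derivation sketch gives F_X(ln 2) = 1 - 0.5/1.5 = 2/3.
random.seed(1)
n = 200_000
xs = [-math.log(2 * u / (1 + u))
      for u in (random.random() for _ in range(n)) if u > 0]
print(min(xs) >= 0)                          # support is [0, +inf)
est = sum(x <= math.log(2) for x in xs) / len(xs)
print(abs(est - 2 / 3) < 0.01)
```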
Problem 4. A cat and a dog live in a square room. The corners of the room are numbered
1, 2, 3 and 4. Each night, the cat decides at random at which corner it will sleep. The
dog joins the cat in that corner with probability 1/2, but otherwise it chooses one of the
remaining three corners at random. Denote by X the number of the corner the cat will
sleep at, and by Y the number of the dog’s corner.
(i) Give the pmf p(X,Y ) (x, y) and the marginal distributions of the bivariate random
variable (X, Y ) in a table form. (Hint: think conditional probability first).
(ii) Using the table from (i), find the probability P(X ≤ 3, Y ≤ 2).
(iii) Find the marginal distribution of Y .
(iv) Compute E(X − Y ).
(v) Compute the correlation coefficient ρ between X and Y .
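The conditional mechanism in this problem is easy to simulate directly; a sketch (illustrative Python, with the target value 5/12 for part (ii) obtained by summing the joint pmf p(x,y) = 1/8 on the diagonal and 1/24 off it):

```python
import random

# Cat picks a corner uniformly; dog joins with prob 1/2, otherwise
# picks one of the other three corners uniformly at random.
random.seed(2)
n = 300_000
hits = diff = 0
for _ in range(n):
    x = random.randint(1, 4)
    if random.random() < 0.5:
        y = x
    else:
        y = random.choice([c for c in (1, 2, 3, 4) if c != x])
    hits += (x <= 3 and y <= 2)
    diff += x - y
print(abs(hits / n - 5 / 12) < 0.01)   # P(X<=3, Y<=2) = 5/12 by direct summation
print(abs(diff / n) < 0.02)            # E(X - Y) = 0 by symmetry
```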
Problem 5. Let (X, Y ) be a continuous bivariate random variable with joint pdf
f (x, y) = 8xy,   0 ≤ x ≤ y ≤ 1.
(i) Sketch the graph of the support of X and Y .
(ii) Find fX (x), the marginal pdf of X.
(iii) Find fY (y), the marginal pdf of Y .
(iv) Compute E(X), E(Y ), Var(X), Var(Y ), Cov(X, Y ), and ρ.
(v) Find fX|Y (x|y = 1/2), the conditional pdf of X given Y = 1/2.
(vi) Find P(X < 3/4 | Y = 1/2).
(vii) Find P(X < 3/4 | Y < 1/2).
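The joint density factors nicely, which gives a direct way to sample (X, Y) and cross-check the moments in (iv). A sketch (illustrative Python; the inverse-cdf factorization F_Y(y) = y⁴ and F_{X|Y}(x|y) = (x/y)² is my own working, to be confirmed against your answers to (ii) and (v)):

```python
import random

# Sample (X, Y) with joint pdf 8xy on 0 <= x <= y <= 1 by inverse cdf:
# Y = W**0.25 with W ~ U(0,1), then X = y * V**0.5 with V ~ U(0,1).
random.seed(3)
n = 200_000
samples = []
for _ in range(n):
    y = random.random() ** 0.25
    x = y * random.random() ** 0.5
    samples.append((x, y))
ex = sum(x for x, _ in samples) / n
ey = sum(y for _, y in samples) / n
print(abs(ex - 8 / 15) < 0.01, abs(ey - 4 / 5) < 0.01)  # E(X) = 8/15, E(Y) = 4/5
```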
Problem 6. Let X1 , . . . , Xn be a random sample from a B(p) distribution. Find the maximum
likelihood estimator of p. Is this estimator efficient?
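The stationarity argument behind the MLE can be checked numerically; a grid-search sketch (illustrative Python, not a substitute for the calculus; it confirms that the Bernoulli log-likelihood is maximized at the sample mean):

```python
import math
import random

# Bernoulli(p) log-likelihood: sum(x)*log(p) + (n - sum(x))*log(1 - p).
# Setting its derivative to zero gives p_hat = sample mean; verify by grid search.
random.seed(4)
p_true = 0.3
xs = [1 if random.random() < p_true else 0 for _ in range(1000)]
s, n = sum(xs), len(xs)

def loglik(p):
    return s * math.log(p) + (n - s) * math.log(1 - p)

grid = [i / 1000 for i in range(1, 1000)]
best = max(grid, key=loglik)
print(abs(best - s / n) < 1e-3)   # grid maximizer sits at the sample mean
```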
Problem 7. Let X, Y and Z be independent random variables, with known mgf’s MX (t)
and MY (t) and with P(Z = 1) = 1 − P(Z = 0) = p ∈ (0, 1). Compute the mgf of the
random variable
S = ZX + (1 − Z)Y.
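Conditioning on Z gives the mixture form M_S(t) = p·MX(t) + (1 − p)·MY(t); a numeric illustration (the choice X ∼ N(0,1), Y ∼ N(2,1) is arbitrary, made only so the exact mgfs are available in closed form):

```python
import math
import random

# Monte Carlo estimate of E[e^{tS}] for S = Z*X + (1-Z)*Y, compared with
# the conditioning formula p*M_X(t) + (1-p)*M_Y(t).
random.seed(5)
p, t, n = 0.3, 0.5, 400_000
acc = 0.0
for _ in range(n):
    z = 1 if random.random() < p else 0
    s = random.gauss(0, 1) if z else random.gauss(2, 1)
    acc += math.exp(t * s)
mc = acc / n
# Normal mgfs: M_X(t) = exp(t^2/2), M_Y(t) = exp(2t + t^2/2).
exact = p * math.exp(t * t / 2) + (1 - p) * math.exp(2 * t + t * t / 2)
print(abs(mc - exact) / exact < 0.02)
```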
Problem 8. Suppose you choose a point x at random in the interval [10, 18]. In other
words, x is an observed value of a random variable X uniformly distributed between 10
and 18. The random variable X divides the interval [10, 18] into two subintervals, of
lengths X − 10 and 18 − X, respectively. Denote by Z the length of the shorter of the
two intervals. Express Z in terms of X. What are the possible values for Z? Find the
probability P(Z > z). Then find both the cdf and pdf of Z, and give the name of the
distribution of Z if it has a name.
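A quick simulation sketch for this problem (illustrative Python; the survival function P(Z > z) = (8 − 2z)/8 quoted in the comment is the result the problem asks you to derive, and it identifies Z as uniform on (0, 4)):

```python
import random

# Z = min(X - 10, 18 - X) is the shorter piece when X ~ U(10, 18) splits [10, 18].
# For 0 <= z <= 4: P(Z > z) = P(10 + z < X < 18 - z) = (8 - 2z)/8.
random.seed(6)
n = 200_000
zs = [min(x - 10, 18 - x)
      for x in (random.uniform(10, 18) for _ in range(n))]
z0 = 1.0
est = sum(z > z0 for z in zs) / n
print(abs(est - (8 - 2 * z0) / 8) < 0.01)   # P(Z > 1) = 3/4
print(0 <= min(zs) and max(zs) <= 4)        # possible values: [0, 4]
```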
Problem 9. Suppose we have 4 coins labelled 1 to 4, such that the probability that coin
i shows a Head (H) is i/4, for i = 1, . . . , 4:

Coin 1: H with probability 1/4,  Tail (T) with probability 3/4
Coin 2: H with probability 1/2,  T with probability 1/2
Coin 3: H with probability 3/4,  T with probability 1/4
Coin 4: H with probability 1,    T with probability 0.
Consider the following events Bi = {Choose coin i}, for i = 1, . . . , 4.
(i) Experiment 1 At each throw, choose at random one of the 4 coins. Let N1 be
the number of throws before the first H turns up. Give the name of the distribution
of N1 and specify its parameter(s). Compute the probability that this experiment
must be repeated at least 4 times before obtaining the first H.
(ii) Experiment 2 At the beginning of the experiment, choose one coin at random,
and throw it until a head comes up. Let N2 be the number of throws before
obtaining the first head. Find the pmf of N2 , namely pN2 (k) = P(N2 = k), for
k = 1, 2, . . .
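Experiment 1 can be simulated directly; a sketch (illustrative Python; since a fresh coin is drawn every throw, each throw lands H with probability (1/4 + 1/2 + 3/4 + 1)/4 = 5/8 independently, which is the key observation behind part (i)):

```python
import random

# Experiment 1: each throw uses a freshly chosen coin, so every throw is H
# with probability 5/8, independently of the others.
random.seed(7)

def throws_until_head():
    k = 0
    while True:
        k += 1
        coin = random.randint(1, 4)
        if random.random() < coin / 4:
            return k

n = 200_000
ks = [throws_until_head() for _ in range(n)]
est = sum(k >= 4 for k in ks) / n
print(abs(est - (3 / 8) ** 3) < 0.005)   # P(first three throws all T) = 27/512
```

Note that Experiment 2 behaves differently: there the coin is fixed once, so successive throws are no longer unconditionally independent, which is why its pmf is a mixture.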
Problem 10. Let X1 , . . . , Xn be a sample of independent and identically distributed
random variables from an exponential distribution with parameter λ = 1/10. Let Sn be
the sum of X1 , . . . , Xn ,
Sn = X1 + . . . + Xn ,
and denote by X̄ the sample mean of X1 , . . . , Xn ,

X̄ = Sn / n = (X1 + . . . + Xn ) / n.
(i) Find the mgf MSn (t) of the sum Sn . Name the distribution of Sn and specify the
associated parameter values.
(ii) Find the mgf MX̄ (t) of the sample mean X̄. Name the distribution of X̄ and
specify the associated parameter values.
(iii) Find the limiting mgf,

lim_{n→+∞} MX̄ (t),

using the result of (ii). What distribution does the limiting mgf correspond to?
What is the implication of this result?
(iv) Let

Zn = (√n X̄)/5 − 2√n.

Find MZn (t), the mgf of Zn . Then find lim_{n→+∞} MZn (t). Finally, give the limiting
distribution of Zn when n → ∞.
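A simulation sketch for parts (i) and (ii) (illustrative Python; it reads λ = 1/10 as the rate of the exponential, so E(Xi) = 10 and Var(Xi) = 100, with Sn then Gamma-distributed):

```python
import random

# S_n is a sum of n i.i.d. Exp(rate 1/10) variables, i.e. Gamma(n, 1/10),
# so E[S_n] = 10n and Var(S_n) = 100n. Check empirically for n = 5.
random.seed(8)
n_rep, n = 100_000, 5
sums = [sum(random.expovariate(1 / 10) for _ in range(n)) for _ in range(n_rep)]
m = sum(sums) / n_rep
v = sum((s - m) ** 2 for s in sums) / n_rep
print(abs(m - 10 * n) < 0.5)    # E[S_5] = 50
print(abs(v - 100 * n) < 20)    # Var(S_5) = 500
```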
Problem 11. Let X1 , X2 and X3 be three independent geometrically distributed random
variables, with parameters p1 ∈ (0, 1), p2 ∈ (0, 1), and p3 ∈ (0, 1) respectively,

X1 ∼ Geom(p1 ), X2 ∼ Geom(p2 ) and X3 ∼ Geom(p3 ).

Let X = min(X1 , X2 , X3 ).
(i) What are the possible values of X?
(ii) (a) Let x > 0. Compute P(X > x).
(b) Deduce from (a) the name of the distribution of X, and specify its parameter(s).
(c) Suppose p1 = 0.4, p2 = 0.5, and p3 = 0.1. Compute P(X = 3).
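Part (c) can be sanity-checked by simulation; a sketch (illustrative Python, using the trials-until-first-success convention with support {1, 2, . . .}; the target value comes from P(X > x) = ((1−p1)(1−p2)(1−p3))^x, so X ∼ Geom(1 − 0.6·0.5·0.9) = Geom(0.73)):

```python
import random

# Simulate the minimum of three independent geometrics and check
# P(X = 3) = 0.27**2 * 0.73 for p1, p2, p3 = 0.4, 0.5, 0.1.
random.seed(9)

def geom(p):
    k = 1
    while random.random() >= p:
        k += 1
    return k

n = 300_000
est = sum(min(geom(0.4), geom(0.5), geom(0.1)) == 3 for _ in range(n)) / n
print(abs(est - 0.27 ** 2 * 0.73) < 0.005)   # ~ 0.0532
```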
Problem 12. Let X ∼ U(−2, 2). Find fX (x | X < 1/2) and E(X | X < 1/2).
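A final simulation sketch (illustrative Python; conditioning a uniform density on {X < 1/2} just renormalizes the flat density to (−2, 1/2), so the conditional mean is the midpoint (−2 + 1/2)/2 = −3/4):

```python
import random

# Rejection-style conditioning: keep only U(-2, 2) draws below 1/2,
# then compare the conditional sample mean with -3/4.
random.seed(10)
n = 400_000
cond = [x for x in (random.uniform(-2, 2) for _ in range(n)) if x < 0.5]
est = sum(cond) / len(cond)
print(abs(est - (-0.75)) < 0.01)   # E(X | X < 1/2) = -3/4
```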