Homework 1

Read WMS Sections 6.1, 6.2, 6.3, 6.4 (pages 310–313 only), and 6.5. This reviews material
from STAT 416. In addition, review carefully the background for any techniques that seem
unfamiliar, particularly WMS Sections 3.1 to 3.3 and 4.1 to 4.3 on random variables and
WMS Sections 5.6 to 5.8 on means and variances of sums of random variables. Facts about
some standard families of distributions appear on the rear end papers of the text, and, with
added notes, on the “Common Distributions” handout.
1. WMS Exercise 6.1. (6.1 in the 6th ed.) Make a sketch of the density function f, find
the distribution function F of Y, and sketch it also. For each of parts (a), (b), and (c),
sketch both the distribution and density functions of the new random variable. Parts (d)
and (e) ask you to find the expected value of a function U = g(Y) of the random variable
Y in two ways:

E[U] = E[g(Y)] = ∫ u fU(u) du        (first find the density of U)

E[U] = E[g(Y)] = ∫ g(y) fY(y) dy     (use the fact that U = g(Y))
2. Suppose that X1, X2, . . . , Xn are a random sample from the exponential distribution with
mean β. (This means that the Xi are independent random variables, each of which has this
distribution.) Find the density function of the largest observation, Y = max Xi .
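The key identity behind this problem, P(max Xi ≤ y) = P(X ≤ y)^n for independent Xi, can be illustrated by simulation (the derivative step that yields the density is still yours to do). The values n = 5, β = 2, y = 3 below are arbitrary choices for the check.

```python
# Monte Carlo illustration (not the derivation) of the identity behind
# Problem 2: for independent X1, ..., Xn, P(max Xi <= y) = F(y)^n,
# where F is the common cdf. Hypothetical n = 5, beta = 2, y = 3.
import math
import random

random.seed(0)
n, beta, y = 5, 2.0, 3.0
reps = 200_000

hits = sum(max(random.expovariate(1 / beta) for _ in range(n)) <= y
           for _ in range(reps))
empirical = hits / reps
theory = (1 - math.exp(-y / beta)) ** n  # F(y)^n, exponential cdf F

print(empirical, theory)
```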
3. WMS Exercise 6.20. (6.16 in the 6th ed.) (Find the density function in each part.)
4. WMS Exercise 6.28 and 6.33. (6.24 and 6.29 in the 6th ed.)
5. (a) Suppose that X is a Bernoulli random variable. That is, X takes only the values 1
(success) and 0 (failure), and for some probability p of a success (0 < p < 1),
P (X = 1) = p
P (X = 0) = 1 − p
Find the mgf of X. Find the mean E(X) and the second moment E(X^2) in two ways: from
the definition of expected value, and by differentiating the mgf of X. What, therefore, is
the variance V(X) of X?
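The differentiate-the-mgf route can be previewed numerically: for a finite-support rv, m(t) = E[e^{tX}] is a finite sum, and central differences at t = 0 approximate m′(0) = E(X) and m″(0) = E(X²). The value p = 0.3 below is a hypothetical choice.

```python
# Numerical illustration of "moments by differentiating the mgf":
# evaluate m(t) = E[e^(tX)] directly from the pmf, then approximate
# m'(0) and m''(0) by central differences. Hypothetical p = 0.3.
import math

p = 0.3
pmf = {0: 1 - p, 1: p}  # Bernoulli pmf

def mgf(t):
    return sum(prob * math.exp(t * x) for x, prob in pmf.items())

h = 1e-4
m1 = (mgf(h) - mgf(-h)) / (2 * h)            # ≈ m'(0) = E(X)
m2 = (mgf(h) - 2 * mgf(0) + mgf(-h)) / h**2  # ≈ m''(0) = E(X^2)

print(m1, m2, m2 - m1**2)  # ≈ p, p, p(1 - p)
```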
(b) A binomial random variable Y is the count of successes in n independent trials, each
having the same probability p of a success. That is,
Y = X1 + X2 + · · · + Xn
where the Xi are independent Bernoulli rv’s. Use this to find the mgf of Y .
(c) Find the mean and variance of Y in two ways: from the mean and variance of X using
the rules for means and variances of sums of independent rv’s, and by differentiating the mgf
of Y .
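A simulation can cross-check the sum rules in part (c): for Y a sum of n iid Bernoulli(p) rv's, the rules give E(Y) = nE(X) and V(Y) = nV(X). The values n = 10, p = 0.4 below are hypothetical.

```python
# Simulation cross-check for part (c): means and variances of sums of
# independent rv's add, so Y = X1 + ... + Xn of iid Bernoulli(p) should
# have mean n*E(X) and variance n*V(X). Hypothetical n = 10, p = 0.4.
import random
import statistics

random.seed(1)
n, p, reps = 10, 0.4, 100_000

# each inner sum is a binomial draw built from n Bernoulli trials
ys = [sum(random.random() < p for _ in range(n)) for _ in range(reps)]

print(statistics.mean(ys), statistics.variance(ys))  # ≈ n*p and n*p*(1-p)
```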
(d) Suppose that Y1, Y2, . . . , Yk are independent binomial rv’s all with the same p but with
numbers of trials n1, n2, . . . , nk that may differ. Find the mgf of W = Y1 + Y2 + · · · + Yk.
What is the distribution of W?
6. If you need to review the geometric and negative binomial distributions, read WMS
Sections 3.5 and 3.6. The geometric distribution with parameter p is the distribution of the
number of trials to the first success when trials are independent and each has probability p
of success. The negative binomial distribution with parameters p and r is the distribution
of the number of trials to the rth success. So it should be true that if X1, X2, . . . , Xr are
independent geometric rv’s all with parameter p, then
Y = X1 + X2 + · · · + Xr
has the negative binomial distribution with parameters p and r. Use the mgf method to
show that this is true.
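Before (or after) the mgf argument, the claim can be checked empirically: simulate sums of r geometric trial counts and compare their relative frequencies to the negative binomial pmf C(y−1, r−1) p^r (1−p)^(y−r). The values r = 3, p = 0.25 below are hypothetical.

```python
# Monte Carlo illustration of Problem 6 (the mgf proof is still required):
# a sum of r independent Geometric(p) trial counts should follow the
# negative binomial pmf. Hypothetical r = 3, p = 0.25.
import math
import random
from collections import Counter

random.seed(2)
r, p, reps = 3, 0.25, 200_000

def geometric(p):
    """Number of independent trials up to and including the first success."""
    trials = 1
    while random.random() >= p:  # failure with probability 1 - p
        trials += 1
    return trials

counts = Counter(sum(geometric(p) for _ in range(r)) for _ in range(reps))

for y in (3, 6, 12):
    empirical = counts[y] / reps
    theory = math.comb(y - 1, r - 1) * p**r * (1 - p)**(y - r)
    print(y, round(empirical, 4), round(theory, 4))
```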
7. WMS Exercise 6.57. (6.49 in the 6th ed.)