Problem 1. Suppose that there are 20 different types of coupons and you wish to collect all of them.
You collect one coupon every day, and it is equally likely for you to collect any of the 20 types.
(a) What is the expected number of distinct coupon types that you obtain in 60 days?
(b) Let X be the number of coupons that you collect until you have all 20 types. What is the expected
value of X? What is the variance of X?
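Although the problem asks for exact values, both quantities are easy to sanity-check by simulation. The sketch below (function name `simulate` and the trial count are my own choices) estimates the expected number of distinct types in 60 days as well as the mean and variance of X:

```python
import random

def simulate(n_types=20, n_days=60, trials=20_000, seed=1):
    """Estimate E[# distinct types after n_days draws], E[X], and Var(X)."""
    rng = random.Random(seed)
    distinct_sum = 0.0        # distinct types seen by day n_days
    x_sum = x_sq_sum = 0.0    # draws needed to complete the collection
    for _ in range(trials):
        seen = set()
        day = 0
        while len(seen) < n_types:
            day += 1
            seen.add(rng.randrange(n_types))
            if day == n_days:
                distinct_sum += len(seen)
        if day < n_days:      # collection finished before day n_days
            distinct_sum += n_types
        x_sum += day
        x_sq_sum += day * day
    mean_distinct = distinct_sum / trials
    mean_x = x_sum / trials
    var_x = x_sq_sum / trials - mean_x ** 2
    return mean_distinct, mean_x, var_x
```

Comparing these estimates against your closed-form answers is a useful check on both.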
Problem 2. At a certain stage of a criminal investigation, the inspector in charge is 60% convinced
of the guilt of a certain suspect. Suppose now that a new piece of evidence shows that the criminal is
left-handed. If 20% of the population is left-handed, how certain of
the guilt of the suspect should the inspector now be, if it turns out that the suspect is also left-handed?
State clearly any assumptions that you make.
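One way to organize the computation, under the natural assumption that a guilty suspect is left-handed with certainty while an innocent suspect is left-handed at the 20% population rate (the function name is illustrative):

```python
def posterior_guilt(prior=0.60, p_left_innocent=0.20, p_left_guilty=1.0):
    """Bayes' rule: P(guilty | suspect is left-handed)."""
    num = p_left_guilty * prior
    den = num + p_left_innocent * (1.0 - prior)
    return num / den
```

With these inputs the posterior is 0.6 / (0.6 + 0.2 x 0.4) = 15/17, roughly 0.882.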
Problem 3. (a) Show that
\[
\binom{k-1}{k-1} + \binom{k}{k-1} + \cdots + \binom{n-1}{k-1} = \frac{n!}{(n-k)!\,k!}.
\]
(b) Let X be a negative binomial random variable with parameters (r, p) and Y a binomial r.v. with
parameters (n, p). Show that
P(X > n) = P(Y < r).
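Before proving the identity in (b), it can be sanity-checked numerically by computing both sides directly from the two p.m.f.s (helper names below are mine):

```python
from math import comb

def nb_pmf(k, r, p):
    # P(X = k): trial k yields the r-th success
    return comb(k - 1, r - 1) * p**r * (1 - p)**(k - r)

def binom_pmf(j, n, p):
    # P(Y = j) for Y ~ Bin(n, p)
    return comb(n, j) * p**j * (1 - p)**(n - j)

def both_sides(r, n, p):
    left = 1.0 - sum(nb_pmf(k, r, p) for k in range(r, n + 1))  # P(X > n)
    right = sum(binom_pmf(j, n, p) for j in range(r))           # P(Y < r)
    return left, right
```

Agreement for several (r, n, p) choices suggests the identity, but of course does not replace the proof.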
Problem 4. Random variables X and Y are said to have a bivariate normal distribution with parameters
$(\mu_1, \mu_2, \sigma_1^2, \sigma_2^2, \rho)$ if their joint density function f(x, y) is given by
\[
f(x, y) = \frac{1}{2\pi\sigma_1\sigma_2\sqrt{1-\rho^2}}
\exp\left\{ -\frac{1}{2(1-\rho^2)} \left[ \frac{(x-\mu_1)^2}{\sigma_1^2}
- \frac{2\rho(x-\mu_1)(y-\mu_2)}{\sigma_1\sigma_2}
+ \frac{(y-\mu_2)^2}{\sigma_2^2} \right] \right\},
\qquad x, y \in \mathbb{R}.
\]
What is the conditional probability density function of X given Y = y? What about that of Y given
X = x?
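As a check on your derivation: completing the square in the exponent shows that each conditional law is again normal, and in standard notation the result for X given Y = y is
\[
X \mid Y = y \;\sim\; \mathcal{N}\!\left( \mu_1 + \rho\,\frac{\sigma_1}{\sigma_2}\,(y - \mu_2),\; \sigma_1^2 (1-\rho^2) \right),
\]
with the conditional law of Y given X = x obtained by exchanging the roles of the two coordinates.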
Problem 5. For Unif(0, 1) random variables U1, U2, . . . define
\[
N_1 = \min\Big\{ n : \sum_{i=1}^{n} U_i > 1 \Big\}.
\]
That is, N1 is equal to the number of random numbers that must be summed to exceed 1. Estimate
E[N1] by generating 100, 1000, and 10,000 values of N1, respectively. What do you think is the value of
E[N1]?
Do the same for N2, the number of random numbers that must be summed to exceed 2. That is,
\[
N_2 = \min\Big\{ n : \sum_{i=1}^{n} U_i > 2 \Big\},
\]
and estimate E[N2] by generating 100, 1000, and 10,000 values of N2 respectively.
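A minimal simulation sketch covering both parts (function names are mine; the trial counts match those requested):

```python
import random

def sample_N(threshold, rng):
    """Number of Unif(0,1) draws whose running sum first exceeds `threshold`."""
    total, n = 0.0, 0
    while total <= threshold:
        total += rng.random()
        n += 1
    return n

def estimate_mean(threshold, trials, seed=0):
    """Average of `trials` independent copies of N_threshold."""
    rng = random.Random(seed)
    return sum(sample_N(threshold, rng) for _ in range(trials)) / trials

for trials in (100, 1000, 10_000):
    print(trials, estimate_mean(1, trials), estimate_mean(2, trials))
```

Watching the estimates stabilize as the sample size grows should suggest the limiting values.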
Problem 6. (Exercise 10 in Chapter 4, “Simulation” by Ross, 4th edition) In this problem we will
generate a negative binomial random variable with parameters (r, p) in three different ways. (You do
not need to implement your code, but you need to present your precise algorithms and provide clear
explanations.)
Recall that a NB(r, p) r.v. has p.m.f. given by
\[
P(X = k) = \binom{k-1}{r-1}\, p^r (1-p)^{k-r}, \qquad k = r, r+1, \ldots
\]
(a) Use the relationship between NB(r, p) and Geom(p), and the relationship between Geom(p) and
Unif(0, 1) taught in class (or equivalently, in Example 4d, Chapter 4 of the textbook) to obtain
an algorithm to generate NB(r, p).
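Using the standard inverse-transform recipe for Geom(p) (take the ceiling of ln U / ln(1−p) for U ~ Unif(0,1)) and the fact that NB(r, p) is a sum of r independent Geom(p) waiting times, part (a) can be sketched as follows (function names are mine):

```python
import math
import random

def geom(p, rng):
    # Inverse transform for Geom(p) on {1, 2, ...}:
    # smallest k with U <= 1 - (1-p)^k, i.e. ceil(ln U / ln(1-p))
    u = rng.random()
    return math.floor(math.log(u) / math.log(1.0 - p)) + 1

def neg_binom_geom(r, p, rng):
    # NB(r, p) = sum of r i.i.d. Geom(p) inter-success waiting times
    return sum(geom(p, rng) for _ in range(r))
```

This uses exactly r uniforms per NB sample, regardless of the value generated.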
(b) Verify the relation
\[
p(k+1) = \frac{k(1-p)}{k+1-r}\; p(k).
\]
(c) Use the relation in part (b) to give a second algorithm for generating NB(r, p).
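With the recursion from part (b), the usual inverse-transform search over the c.d.f. gives a sketch like this, starting from p(r) = p^r (the function name is mine):

```python
import random

def neg_binom_inverse(r, p, rng):
    # Inverse transform: walk up the c.d.f. using the recursion
    # p(k+1) = k(1-p)/(k+1-r) * p(k), starting from p(r) = p^r.
    u = rng.random()
    k = r
    pk = p ** r       # p(r)
    cdf = pk
    while u > cdf:
        pk *= k * (1.0 - p) / (k + 1 - r)
        k += 1
        cdf += pk
    return k          # smallest k with u <= P(X <= k)
```

Only one uniform is consumed per sample; the cost is the number of recursion steps taken.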
(d) Using the interpretation that NB(r, p) counts the number of i.i.d. Bern(p) trials required to accumulate
r successes, obtain yet another approach for generating NB(r, p).
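The direct Bernoulli-counting interpretation in part (d) can be sketched as (function name is mine):

```python
import random

def neg_binom_trials(r, p, rng):
    # Run Bern(p) trials until the r-th success; return the trial count
    successes = trials = 0
    while successes < r:
        trials += 1
        if rng.random() < p:
            successes += 1
    return trials
```

This is the simplest of the three approaches but uses one uniform per trial, so on average r/p uniforms per sample.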
Problem 7. (Bonus) Verify the conditional variance formula. Namely, for any two random variables
X and Y, show that
\[
\operatorname{Var}(X) = E\big[\operatorname{Var}(X \mid Y)\big] + \operatorname{Var}\big(E[X \mid Y]\big).
\]