Assignment 1

Problem 1. Suppose that there are 20 different types of coupons and you wish to collect all of them. You collect one coupon every day, and it is equally likely for you to collect any of the 20 types.

(a) What is the expected number of distinct coupon types that you obtain in 60 days?

(b) Let X be the number of coupons that you collect until you have all 20 types. What is the expected value of X? What is the variance of X?
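Both parts of this problem lend themselves to a Monte Carlo sanity check. A minimal sketch (Python and its standard `random` module are my choice here; the assignment does not prescribe a language):

```python
import random

random.seed(0)

TYPES = 20
TRIALS = 10_000

# (a) average number of distinct coupon types seen in 60 daily draws
distinct = sum(
    len({random.randrange(TYPES) for _ in range(60)}) for _ in range(TRIALS)
) / TRIALS

# (b) number of days needed until every type has been collected at least once
def days_to_complete():
    seen = set()
    days = 0
    while len(seen) < TYPES:
        seen.add(random.randrange(TYPES))
        days += 1
    return days

mean_x = sum(days_to_complete() for _ in range(TRIALS)) / TRIALS
print(distinct, mean_x)
```

The estimates can then be compared against the closed-form answers derived analytically.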

Problem 2. At a certain stage of a criminal investigation, the inspector in charge is 60% convinced of the guilt of a certain suspect. Suppose now that a new piece of evidence shows that the criminal is left-handed. If 20% of the population is left-handed, how certain of the guilt of the suspect should the inspector now be, if it turns out that the suspect is also left-handed? State clearly any assumptions that you make.

Problem 3.

(a) Show that
\[
\binom{n}{0} + \binom{n}{1} + \cdots + \binom{n}{n} = 2^n,
\]
where \(\binom{n}{k} = n!/[(n-k)!\,k!]\).

(b) Let X be a negative binomial random variable with parameters (r, p) and Y a binomial r.v. with parameters (n, p). Show that
\[
P(X > n) = P(Y < r).
\]
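Before proving the identity, it can be sanity-checked numerically straight from the two p.m.f.s; the parameter values below are arbitrary choices, not part of the problem:

```python
from math import comb

def nb_pmf(k, r, p):
    # P(X = k) for X ~ NB(r, p), defined for k = r, r + 1, ...
    return comb(k - 1, r - 1) * p**r * (1 - p)**(k - r)

def binom_pmf(j, n, p):
    # P(Y = j) for Y ~ Bin(n, p)
    return comb(n, j) * p**j * (1 - p)**(n - j)

r, n, p = 3, 10, 0.3
# P(X > n) = 1 - sum_{k=r}^{n} P(X = k)
left = 1 - sum(nb_pmf(k, r, p) for k in range(r, n + 1))
# P(Y < r) = sum_{j=0}^{r-1} P(Y = j)
right = sum(binom_pmf(j, n, p) for j in range(r))
print(left, right)  # the two sides should agree
```

The proof itself should explain why the two events coincide, rather than rely on the computation.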

Problem 4. Random variables X and Y are said to have a bivariate normal distribution with parameters \((\mu_1, \mu_2, \sigma_1^2, \sigma_2^2, \rho)\) if their joint density function \(f(x, y)\) is given by
\[
f(x, y) = \frac{1}{2\pi\sigma_1\sigma_2\sqrt{1 - \rho^2}}
\exp\left\{ -\frac{1}{2(1 - \rho^2)} \left[ \left( \frac{x - \mu_1}{\sigma_1} \right)^2
- 2\rho \, \frac{x - \mu_1}{\sigma_1} \cdot \frac{y - \mu_2}{\sigma_2}
+ \left( \frac{y - \mu_2}{\sigma_2} \right)^2 \right] \right\}, \qquad x, y \in \mathbb{R}.
\]
What is the conditional probability density function of X given Y = y? What about that of Y given X = x?

Problem 5. For Unif(0, 1) random variables \(U_1, U_2, \ldots\) define
\[
N_1 = \min\left\{ n : \sum_{i=1}^{n} U_i > 1 \right\}.
\]
That is, \(N_1\) is equal to the number of random numbers that must be summed to exceed 1. Estimate \(E[N_1]\) by generating 100, 1000, and 10,000 values of \(N_1\) respectively. What do you think is the value of \(E[N_1]\)?

Do the same for \(N_2\), the number of random numbers that must be summed to exceed 2. That is,
\[
N_2 = \min\left\{ n : \sum_{i=1}^{n} U_i > 2 \right\},
\]
and estimate \(E[N_2]\) by generating 100, 1000, and 10,000 values of \(N_2\) respectively.
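The requested estimator can be sketched as follows (assuming Python's standard `random` module stands in for a Unif(0, 1) generator; any language would do):

```python
import random

random.seed(1)

def n_exceed(threshold):
    """Number of Unif(0,1) draws needed for the running sum to exceed threshold."""
    total, count = 0.0, 0
    while total <= threshold:
        total += random.random()
        count += 1
    return count

# estimate E[N1] and E[N2] from 100, 1000, and 10,000 replications each
for threshold in (1, 2):
    for m in (100, 1000, 10_000):
        est = sum(n_exceed(threshold) for _ in range(m)) / m
        print(f"threshold={threshold}, m={m}: estimate {est:.3f}")
```

Watching the estimates stabilize as m grows should suggest a guess for the exact values of \(E[N_1]\) and \(E[N_2]\).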

Problem 6. (Exercise 10 in Chapter 4, “Simulation” by Ross, 4th edition) In this problem we will generate a negative binomial random variable with parameters (r, p) in three different ways. (You do not need to implement your code, but you need to present your precise algorithms and provide clear reasoning.)

Recall that a NB(r, p) r.v. has p.m.f. given by
\[
p(k) = \binom{k-1}{r-1} p^r (1-p)^{k-r}, \qquad k = r, r+1, \ldots
\]

(a) Use the relationship between NB(r, p) and Geom(p), and the relationship between Geom(p) and Unif(0, 1) taught in class (or equivalently, in Example 4d, Chapter 4 of the textbook) to obtain an algorithm to generate NB(r, p).

(b) Verify the relation
\[
p(k+1) = \frac{k(1-p)}{k+1-r}\, p(k).
\]

(c) Use the relation in part (b) to give a second algorithm for generating NB(r, p).

(d) Using the interpretation that NB(r, p) counts the number of i.i.d. Bern(p) trials required to accumulate r successes, obtain yet another approach for generating NB(r, p).
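For concreteness, the counting interpretation in part (d) could be sketched as follows (illustrative only, since the problem asks for the algorithm and its reasoning rather than code):

```python
import random

random.seed(2)

def nb_bernoulli(r, p):
    """Generate NB(r, p) by counting i.i.d. Bern(p) trials until the r-th success."""
    trials = successes = 0
    while successes < r:
        trials += 1
        if random.random() < p:  # one Bern(p) trial from one uniform
            successes += 1
    return trials

# sanity check: the sample mean should be close to r/p
r, p = 3, 0.4
mean = sum(nb_bernoulli(r, p) for _ in range(20_000)) / 20_000
print(mean)
```

Note that this approach uses a random number of uniforms per generated value, which is worth contrasting with the approaches in parts (a) and (c).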

Problem 7. (Bonus) Verify the conditional variance formula. Namely, for any two random variables X and Y, show that
\[
\mathrm{Var}(X) = E[\mathrm{Var}(X \mid Y)] + \mathrm{Var}(E[X \mid Y]).
\]
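Since the formula holds for every joint distribution, it can be sanity-checked numerically on a small discrete joint p.m.f. before attempting the proof; the numbers below are purely illustrative:

```python
# an arbitrary joint p.m.f. on (X, Y); probabilities sum to 1
joint = {(0, 0): 0.10, (0, 1): 0.20, (1, 0): 0.30, (1, 1): 0.15, (2, 1): 0.25}

def var(pmf):
    """Variance of a distribution given as {value: probability}."""
    mean = sum(x * p for x, p in pmf.items())
    return sum((x - mean) ** 2 * p for x, p in pmf.items())

# marginals of X and Y
px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p
    py[y] = py.get(y, 0.0) + p

# E[X | Y = y] and Var(X | Y = y) for each y
cond_mean, cond_var = {}, {}
for y0, pyy in py.items():
    cond = {x: p / pyy for (x, y), p in joint.items() if y == y0}
    cond_mean[y0] = sum(x * q for x, q in cond.items())
    cond_var[y0] = var(cond)

lhs = var(px)                                                      # Var(X)
e_cond_var = sum(cond_var[y] * py[y] for y in py)                  # E[Var(X | Y)]
m = sum(cond_mean[y] * py[y] for y in py)                          # E[E[X | Y]]
var_cond_mean = sum((cond_mean[y] - m) ** 2 * py[y] for y in py)   # Var(E[X | Y])
rhs = e_cond_var + var_cond_mean
print(lhs, rhs)  # the two sides should agree
```

The proof itself should proceed symbolically, e.g. by expanding both sides with the tower property.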
