Basic Probability Exam Packet


Description: probability practice problems

Transcript of Basic Probability Exam Packet

Page 1: Basic Probability Exam Packet

DEPARTMENT OF MATHEMATICS AND STATISTICS
UMASS - AMHERST

BASIC EXAM - PROBABILITY
January 2013

Work all problems. Show all work. Explain your answers. State the theorems used whenever possible. 60 points are needed to pass at the Masters Level and 75 to pass at the Ph.D. level.

1. Let Y1 and Y2 have the joint probability density function given by

f(y1, y2) = c(2 − y1) for 0 ≤ y2 ≤ y1 ≤ 2.

For the following parts, you can leave your answers in terms of integrals with explicit limits. No need to give the final numerical answers.
(a) (6 points) Find c.
(b) (6 points) Find the marginal density functions for Y1 and Y2.
(c) (6 points) Are Y1 and Y2 independent? Why or why not?
(d) (6 points) Find the conditional density of Y1 given Y2 = y2 and P(Y1 ≥ 1.5 | Y2 = 0.9).

2. Let X ∼ N(μ, σ²).
(a) (6 points) Show that the moment generating function of X is M_X(t) = exp(μt + σ²t²/2).
(b) (7 points) Show that if X1 ∼ N(μ1, σ1²), X2 ∼ N(μ2, σ2²), X3 ∼ N(μ3, σ3²), and X1, X2, X3 are independent, then X1 + X2 + X3 ∼ N(μ1 + μ2 + μ3, σ1² + σ2² + σ3²).
(c) (7 points) Suppose μi and σi, i = 1, 2, 3, are known quantities. Based on X1, X2, X3, construct a statistic that has a Chi-squared distribution with 3 degrees of freedom, a statistic that has a t distribution with 2 degrees of freedom, and a statistic that has an F distribution with 1 and 2 degrees of freedom.
(d) (7 points) Suppose that μi = μ and σi² = σ² for i = 1, 2, 3. Define S² = (1/2) Σ_{i=1}^{3} (Xi − X̄)² with X̄ = (X1 + X2 + X3)/3. Show that E(S²) = σ² and E(S) ≤ σ.

3. Suppose (X1, X2, X3) follows a multinomial distribution with m trials and cell probabilities p1, p2, p3. Note that the Multinomial(m, p1, p2, p3) probability mass function is

(m!/(x1! x2! x3!)) p1^x1 p2^x2 p3^x3, x1 + x2 + x3 = m, p1 + p2 + p3 = 1.

(a) (7 points) Find the marginal distribution of X1 and the conditional distribution of X2 given X1. Write down your reasoning. No mathematical proof is needed.
(b) (7 points) Show that Cov(X1, X2) = −m p1 p2.

(c) (7 points) Let (X11, X21, X31)ᵀ, (X12, X22, X32)ᵀ, . . . , (X1n, X2n, X3n)ᵀ be iid random vectors from the above multinomial distribution. Let X̄_j = (1/n) Σ_{i=1}^{n} X_{ji}, j = 1, 2. Find the limiting joint distribution of (X̄_1, X̄_2). State the theorem used.
(d) (7 points) Find the limiting distribution of Y_n = X̄_1/X̄_2. State the theorem used.

4. Suppose we have three cards. The first one is blank on both sides, the second has an X on one side and is blank on the other, and the third has an X on both sides. We run an "experiment" where we choose one card at random and then look at one side of the chosen card at random.

Page 2: Basic Probability Exam Packet

(a) (7 points) What is the probability that you see an X?
(b) (7 points) What is the probability that you see an X and the other side of the chosen card has an X too?
(c) (7 points) Suppose we run the experiment above and we see an X. Given that outcome, what is the probability that the other side of the card has an X on it too?
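As an illustration (not from the original exam), a minimal Monte Carlo sketch of the three-card experiment in Python; the card encoding, trial count, and seed are arbitrary choices.

```python
import random

# Each card is a pair of sides; True marks an X. (Illustrative encoding.)
CARDS = [(False, False), (True, False), (True, True)]

def one_trial(rng):
    """Choose a card at random, then look at one of its two sides at random."""
    card = rng.choice(CARDS)
    side = rng.randrange(2)
    return card[side], card[1 - side]   # (side shown, hidden side)

def estimate(n_trials=100_000, seed=0):
    rng = random.Random(seed)
    saw_x = both_x = 0
    for _ in range(n_trials):
        shown, hidden = one_trial(rng)
        if shown:
            saw_x += 1
            if hidden:
                both_x += 1
    # Empirical analogues of parts (a), (b), and (c), in that order.
    return saw_x / n_trials, both_x / n_trials, both_x / saw_x

if __name__ == "__main__":
    print(estimate())
```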


Page 3: Basic Probability Exam Packet

UNIVERSITY OF MASSACHUSETTS
Department of Mathematics and Statistics

Basic Exam - Statistics
Wednesday, August 29, 2012

Work all problems. 60 points are needed to pass at the Masters Level and 75 to pass at the Ph.D. level.

1. (15 points) Jane is trapped in a mine containing 3 doors. The first door leads to a tunnel that will take her to safety after 3 hours of travel. The second door leads to a tunnel that will return her to her starting point in the mine after 5 hours of travel. The third door leads to a tunnel that will return her to her starting point in the mine after 7 hours. If we assume that Jane is at all times equally likely to choose any one of the doors, what is the expected length of time until she reaches safety?

2. Suppose we are in a situation where the value of a random variable X is observed and then, based on the observed value, an attempt is made to predict the value of another random variable Y. Let g(X) denote the predictor (i.e., if X is observed to be equal to x, then g(x) is our prediction for the value of Y). We would like to choose g so that g(X) tends to be close to Y. Suppose we decide to choose g to minimize E((Y − g(X))²). (That defines "best" below.) Assume that the means and variances of X and Y, denoted by μ_X = E(X), μ_Y = E(Y), σ²_X = Var(X) and σ²_Y = Var(Y), and the correlation of X and Y, denoted by ρ_XY = Cov(X, Y)/√(σ²_X σ²_Y), are known.

(a) (10 points) What is the best predictor of Y ? (show your calculation in detail)

(b) (5 points) Consider linear predictors of Y, i.e., g(X) = a + bX. In that case, what is the best linear predictor of Y with respect to X? In other words, choose a and b in g(X) = a + bX as functions of the means and variances of X and Y and the correlation of X and Y.

3. The number of defects per yard in a certain fabric, Y, is known to have a Poisson distribution with parameter λ. The probability mass function is:

Pr(Y = y | λ) = exp(−λ) λ^y / y!, λ > 0, y = 0, 1, 2, . . .

Suppose that λ is also an exponential random variable with mean 1 (pdf: f(λ) = exp(−λ), λ ≥ 0, and 0 otherwise).

(a) (5 points) Write down and solve an integral expression for the unconditional probability mass function for Y.

(b) (5 points) Without using your answer from part a above, what is the unconditional expectation of Y?


Page 4: Basic Probability Exam Packet

(c) (5 points) Without using your answer from part a above, what is the unconditional variance of Y?

4. Suppose Y is a random variable with pdf g(y), X is a random variable with pdf f(x), and U has a U(0, 1) distribution. Further, let M > 1 be a constant where f(x) < Mg(x) for all x. Consider the following algorithm.

(i) Sample y from g(y) and u from a U(0, 1).

(ii) If u < f(y)/(Mg(y)) then "accept" y and stop the algorithm. If not, go to (i).

(a) (5 points) Show that Pr(U < f(Y)/(Mg(Y))) = E(f(Y)/(Mg(Y))), where the expectation is with respect to Y.

(b) (5 points) Show that the probability that the algorithm accepts y on the first try is 1/M.

(c) (5 points) What is the expected number of tries the algorithm will make until it accepts for the first time?

(d) (5 points) Derive the density of the accepted y.
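As an illustration (not from the original exam), a minimal rejection-sampling sketch of the algorithm in problem 4; the target f (a Beta(2,2) density), the proposal g (uniform on (0,1)), and the bound M = 2 are assumptions chosen only for the example.

```python
import random

def rejection_sample(f, g_sample, g_pdf, M, rng):
    """Repeat steps (i)-(ii): draw y ~ g and u ~ U(0,1); accept y if u < f(y)/(M*g(y))."""
    while True:
        y = g_sample(rng)
        u = rng.random()
        if u < f(y) / (M * g_pdf(y)):
            return y

# Assumed example: target Beta(2,2) on (0,1), proposal U(0,1), and M = 2
# so that f(x) < M * g(x) for all x (the target density peaks at 1.5).
f = lambda x: 6.0 * x * (1.0 - x)
g_sample = lambda rng: rng.random()
g_pdf = lambda x: 1.0
M = 2.0

rng = random.Random(1)
draws = [rejection_sample(f, g_sample, g_pdf, M, rng) for _ in range(10_000)]
print(sum(draws) / len(draws))  # should be near 0.5, the Beta(2,2) mean
```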

5. Central Limit Theorem and related topics

(a) (10 points) State carefully a Central Limit Theorem for a sequence of i.i.d. random variables.

(b) (10 points) Suppose X_i, i = 1, . . . , 100 are i.i.d. Poisson(0.0001). What is the standard error of the sample mean?

(c) (5 points) Let Y = Σ_{i=1}^{100} X_i. Use the Central Limit Theorem to write an expression to approximate Pr(Y ≥ 1). (You do not need a number.)

(d) (5 points) Is the answer to part c close to zero or close to one? Why?

(e) (5 points) The Poisson(λ) moment generating function is exp(λ(exp(t) − 1)). Use that result to derive an answer to part c in another way. Does that support or disagree with your answer to part d?


Page 5: Basic Probability Exam Packet

DEPARTMENT OF MATHEMATICS AND STATISTICS
UMASS - AMHERST

BASIC EXAM - PROBABILITY
WINTER 2012

Work all problems. Show your work. Explain your answers. State the theorems used whenever possible. 60 points are needed to pass at the Masters level and 75 to pass at the Ph.D. level.

1. (20 points) There is more information in the joint distribution of two random variables than can be discerned by looking only at their marginal distributions. Consider two random variables, X1 and X2, each distributed binomial(1, π), where 0 < π < 1. Let Q_ab = P{X1 = a, X2 = b}.

(a) In general, show that 0 ≤ Q11 ≤ π. In particular, evaluate Q11 in three cases: where X1 and X2 are independent, where X2 = X1, and where X2 = 1 − X1.

(b) For each case in (a), evaluate Q00.

(c) If P{X2 = 1 | X1 = 0} = α and P{X2 = 0 | X1 = 1} = β, then express π, Q00, and Q11 in terms of α and β.

(d) In part (c), find the correlation between X1 and X2 in terms of α and β.

2. (20 points) Let Y1 and Y2 have the joint probability density function:

f(y1, y2) = k(1 − y2) for 0 ≤ y1 ≤ y2 ≤ 1, and f(y1, y2) = 0 otherwise.

(a) Find k.

(b) Find the marginal density functions for Y1 and Y2.

(c) Are Y1 and Y2 independent? Why or why not?

(d) Find the conditional density function of Y2 given Y1 = y1.

(e) Find Pr(Y2 ≥ 3/4 | Y1 = 1/2).

3. (20 points) Suppose that the random variable Y has a Poisson distribution with mean λ. The probability mass function is

f(y | λ) = e^{−λ} λ^y / y!, for λ > 0, y = 0, 1, 2, . . .

(a) Prove that e^x = Σ_{k=0}^{∞} x^k / k!.

(b) Find the moment generating function of Y.

(c) Suppose that Y1 and Y2 are independent Poisson random variables with means λ1 and λ2, respectively.


Page 6: Basic Probability Exam Packet

i. Derive the distribution of Z = Y1 + Y2.

ii. Derive the distribution of Y1|Z = k.

4. (20 points) Suppose X_i, i = 1, . . . , n are independent and each has mean μ and variance σ² < ∞. Let Z_i = X_i − μ.

(a) Let S_n = Z1 + . . . + Z_n. Prove that lim_{n→∞} Pr(|S_n/n| > ε) = 0 for any ε > 0.

(b) Find f(n, σ), a function of n and σ, so that Z_n = f(n, σ) S_n converges in distribution to a standard normal distribution as n → ∞.

(c) Approximately what is lim_{n→∞} Pr(|Z_n| > 1.645)?

5. (20 points) Suppose we flip coins. Let the random variable X_i = 1 if the ith flip is a head and 0 otherwise. Assume that the X_i's are independent Bernoulli random variables with Pr(X_i = 1) = π. Let N be the number of flips required to get the first head (N = 1, 2, . . .).

(a) What is E(N |X1 = i), i = 0, 1?

(b) Use the result from part (a) and the law of iterated expectations to derive

E(N).

(c) What is the probability mass function of N?

(d) Let M = N − k, where k > 0 is a constant integer. Derive Pr(M > m | N > k).

(e) What is the probability mass function of M?


Page 7: Basic Probability Exam Packet

UNIVERSITY OF MASSACHUSETTS
Department of Mathematics and Statistics

Basic Exam - Probability
Friday, September 2, 2011

Work all problems. 60 points are needed to pass at the Masters Level and 75 to pass at the Ph.D. level.

1. (20 PTS) Suppose that Y is uniformly distributed on the interval (0,1).

(a) Find the moment generating function for Y .

(b) If a is a positive constant, derive the moment generating function for W = aY. What is the distribution of W? Why?

(c) If a is a positive constant and b is a fixed constant, derive the moment generating function of V = aY + b. What is the distribution of V? Why?

2. (20 PTS) Let X1 and X2 be independent variables each having an exponential distribution with mean 1. Let Y1 = X1/X2 and Y2 = X2.

(a) Without any calculation, give the marginal distribution of Y2.

(b) Find the joint probability density function of Y1 and Y2.

(c) Find the conditional distribution of Y1 given Y2 = y2.

(d) Find the marginal probability density function of Y1.

3. (20 PTS) A blood test is 99 percent effective in detecting a certain disease when the disease is present. However, the test also yields a false-positive result for 2 percent of healthy patients tested, who do not have the disease. Suppose 0.5 percent of the population has the disease.

(a) Find the conditional probability that a randomly tested individual actually has the disease given that his or her test result is positive.

(b) Suppose instead that an individual is tested only if he or she has symptoms. Among those with symptoms, 50% are known to have the disease. Find the conditional probability that a person with symptoms actually has the disease given that his or her test result is positive.


Page 8: Basic Probability Exam Packet

4. (25 PTS) (X1, X2) is a bivariate random variable; define θ = P(X1 > X2). Define the function g(X1, X2) as follows:

g(X1, X2) = 1 if X1 > X2, and 0 otherwise.

(a) What is the expected value of g(X1, X2), E(g(X1, X2))?

(b) Let the pairs (X1i, X2i) be independent and identically distributed samples with the same distribution as (X1, X2), where i = 1, . . . , n. Define Q = Σ_{i=1}^{n} g(X1i, X2i). Find the distribution of Q.

(c) Show Q/n converges in probability to θ as n goes to infinity.

(d) Obtain the asymptotic distribution of n^{−1/2}(Q − nθ).

5. (15 PTS) Consider a population consisting of 3 units with known sizes, s1 = 1, s2 = 2, s3 = 3, from which we will select n = 2 units using a procedure known as probability proportional to size without replacement (PPSWOR) sampling. The PPSWOR algorithm proceeds as follows:

i. Select the first unit with probability proportional to size.

ii. Select the next unit with probability proportional to size from among the remaining un-sampled units.

iii. Repeat step (ii) until a sample of size n is obtained.

(a) What is the probability that the first unit selected is the unit of size 2 (s2 = 2)?

(b) What is the probability that the n = 2 units selected are the units of sizes 1 and 2 (in any order)?

(c) Give numerical expressions for the probabilities of sampling each possible pair of units (you need not simplify these expressions completely).

(d) Give numerical expressions for the probabilities of sampling each unit in the population, that is, for π1, π2, and π3, where πi is the probability that unit i is selected (again you need not simplify these expressions completely).

(e) This sampling scheme is sometimes used to approximate a probability proportional to size (PPS) sample. In a PPS sample, each unit is sampled with probability proportional to its size. That is, πi/πj = si/sj. Simplify the expressions in the previous part (d) as necessary to show that the PPSWOR sampling algorithm does not result in a PPS sample.
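As an illustration (not from the original exam), a short enumeration sketch of the PPSWOR scheme for the sizes s1 = 1, s2 = 2, s3 = 3 given above; it computes the pair probabilities of part (c) and the inclusion probabilities of part (d) numerically so the comparison in part (e) can be checked.

```python
from itertools import permutations

sizes = {1: 1, 2: 2, 3: 3}  # unit label -> size s_i

def ppswor_pair_probs(sizes):
    """Probability of each ordered pair under PPSWOR: draw the first unit with
    probability proportional to size, then the second with probability
    proportional to size among the remaining units."""
    total = sum(sizes.values())
    probs = {}
    for i, j in permutations(sizes, 2):
        p_first = sizes[i] / total
        p_second = sizes[j] / (total - sizes[i])
        probs[(i, j)] = p_first * p_second
    return probs

pair_probs = ppswor_pair_probs(sizes)

# Unordered pair probabilities (part (c)) and inclusion probabilities pi_i (part (d)).
unordered = {}
for (i, j), p in pair_probs.items():
    key = frozenset((i, j))
    unordered[key] = unordered.get(key, 0.0) + p
inclusion = {i: sum(p for pair, p in unordered.items() if i in pair) for i in sizes}

print(unordered)
print(inclusion)  # compare the ratios pi_i / pi_j with s_i / s_j for part (e)
```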


Page 9: Basic Probability Exam Packet

DEPARTMENT OF MATHEMATICS AND STATISTICS
UMASS - AMHERST

BASIC EXAM - PROBABILITY
WINTER 2011

Work all problems. 60 points are needed to pass at the Masters Level and 75 to pass at the Ph.D. level. Each question is worth 20 points.

1. Let A be a bounded region in R² and |A| be its area. The boundary of A is known and, for any (x, y) ∈ R², it is easy to determine whether (x, y) ∈ A. However, |A| is not known and cannot be calculated analytically. The following method is proposed to estimate |A|:

(a) Construct a rectangle B that contains A.

(b) Generate N (a large integer) points at random, uniformly, in B.

(c) Let X be the number of generated points that lie within A.

(d) Use X/N as an estimate of |A|/|B| and |B| × (X/N) as an estimate of |A|.

The user can choose B, as long as it's big enough to contain A. If the goal is to estimate |A| as accurately as possible, what advice would you give the user for choosing the size of B? Should |B| be large, small, or somewhere in between? Justify your answer. Hint: think about Binomial distributions.
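As an illustration (not from the original exam), a hit-or-miss sketch of the proposed estimator; the region A (a unit disk) and the two candidate rectangles B are assumptions chosen only to make the comparison concrete.

```python
import random

def estimate_area(in_A, B_bounds, N, rng):
    """Hit-or-miss estimate: draw N uniform points in the rectangle B and
    return |B| * (X / N), where X counts the points that fall in A."""
    (x0, x1), (y0, y1) = B_bounds
    area_B = (x1 - x0) * (y1 - y0)
    hits = 0
    for _ in range(N):
        x = rng.uniform(x0, x1)
        y = rng.uniform(y0, y1)
        if in_A(x, y):
            hits += 1
    return area_B * hits / N

# Illustrative A: the unit disk, whose true area is pi.
in_disk = lambda x, y: x * x + y * y <= 1.0
rng = random.Random(0)
tight = estimate_area(in_disk, ((-1, 1), (-1, 1)), 100_000, rng)   # snug rectangle
loose = estimate_area(in_disk, ((-5, 5), (-5, 5)), 100_000, rng)   # much larger rectangle
print(tight, loose)  # the snug rectangle typically gives the smaller error
```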


Page 10: Basic Probability Exam Packet

2. You go to the bus stop to catch a bus. You know that buses arrive every 15 minutes, but you don't know when the next is due. Let T be the time elapsed, in hours, since the previous bus. Adopt the prior distribution T ∼ Unif(0, 1/4).

(a) Find E[T ].

Passengers, apart from yourself, arrive at the bus stop according to a Poisson process with rate λ = 2 people per hour; i.e., in any interval of length ℓ, the number of arrivals has a Poisson distribution with parameter 2ℓ and, if two intervals are disjoint, then their numbers of arrivals are independent. Let X be the number of passengers, other than yourself, waiting at the bus stop when you arrive.

(b) Suppose X = 1. Write an intuitive argument for whether that should increase or decrease your expected value for T. I.e., is E[T | X = 1] greater than, less than, or the same as E[T]?

(c) Find the density of T given X = 1, up to a constant of proportionality. It is a truncated version of a familiar density. What is the familiar density?


Page 11: Basic Probability Exam Packet

3. A discrete-time Markov chain is a series of indexed random variables, {X0, X1, X2, . . .}, which displays the Markov property, namely

Pr(Xn+1 = j|X0 = x0, X1 = x1, X2 = x2, . . . Xn = i) = Pr(Xn+1 = j|Xn = i).

Consider such a Markov chain in which there are only finitely many possible x's and in which the so-called transition probabilities are given by the matrix p such that

pij = Pr(Xn+1 = j|Xn = i),

constant for all n ≥ 0.

(a) Give an expression in terms of p for the probability that Xn+2 = j

given Xn = i.

(b) Give an expression in terms of p for the probability that Xn+m = j given Xn = i. For full credit, whenever possible, express your answer using matrix notation rather than functions of the matrix elements.

(c) Prove that for any Markov chain, Pr(X3 = x3 | X0 = x0, X1 = x1) = Pr(X3 = x3 | X1 = x1).
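As an illustration (not from the original exam), a small numerical sketch of the quantities in parts (a) and (b): with the transition probabilities collected in a matrix p, multi-step transition probabilities are entries of matrix powers of p. The 2-state chain below is an assumed example.

```python
import numpy as np

# Assumed example: a 2-state chain with p[i, j] = Pr(X_{n+1} = j | X_n = i).
p = np.array([[0.9, 0.1],
              [0.4, 0.6]])

# m-step transition probabilities: Pr(X_{n+m} = j | X_n = i) is the (i, j) entry of p^m.
p2 = np.linalg.matrix_power(p, 2)   # part (a): m = 2
p5 = np.linalg.matrix_power(p, 5)   # part (b): general m, here m = 5
print(p2)
print(p5)
```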


Page 12: Basic Probability Exam Packet

4. Suppose X1 and X2 are random variables with joint density function f(x1, x2) = c when x1 + x2 ≤ 1 and both x1 and x2 are non-negative. The density f(x1, x2) = 0 otherwise. Except for part (a), purely graphical solutions will not get full credit.

(a) Draw a picture to show the x1 and x2 values where the density is non-zero.

(b) What is c?

(c) What is the probability that X1 > X2?

(d) Are X1 and X2 independent? Why or why not?

(e) What is the density of Y = 1/X1?


Page 13: Basic Probability Exam Packet

5. Suppose X1 and X2 are independent and identically distributed random variables with density f(x) = λ exp(−λx), x ≥ 0, and f(x) = 0 otherwise.

(a) The moment generating function of a random variable X is M_X(t) = E[e^{tX}]. Find the moment generating function of X1.

(b) Use the moment generating function to show that Y = X1 + X2 has density f(y) = λ² y exp(−λy), y ≥ 0, and f(y) = 0 otherwise.

(c) Suppose λ = 1. Let c > 0. Show that the density of X1 | X1 > c is exp(−x)/{1 − exp(−c)}, x > c, and 0 otherwise.

(d) Suppose λ = 1. Let c > 0. Find E(X1 | X1 > c).


Page 14: Basic Probability Exam Packet

DEPARTMENT OF MATHEMATICS AND STATISTICS
UMASS - AMHERST

BASIC EXAM - PROBABILITY
FALL 2010

Work all problems. 60 points are needed to pass at the Masters Level and 75 to pass at the Ph.D. level. Each question is worth 20 points.

1. Suppose you are told to toss a die until you have observed each of the six faces.

(a) Let Y1 be the trial on which the first face is tossed, Y2 be the number of additional tosses required to get a face different than the first, Y3 be the number of additional tosses required to get a face different than the first two distinct faces, . . ., and Y6 be the number of additional tosses required to get the last remaining face after all other faces have been observed. Find the distribution of each Yi, i = 1, · · · , 6.

(b) What is the expected number of tosses required in order to observe each of the six faces?
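As an illustration (not from the original exam), a quick simulation sketch of the tossing scheme in problem 1, useful for checking the expectation asked for in part (b); the trial count and seed are arbitrary.

```python
import random

def tosses_until_all_faces(rng, faces=6):
    """Toss a fair die until every face has appeared; return the number of tosses."""
    seen = set()
    tosses = 0
    while len(seen) < faces:
        seen.add(rng.randrange(faces))
        tosses += 1
    return tosses

rng = random.Random(0)
n = 100_000
avg = sum(tosses_until_all_faces(rng) for _ in range(n)) / n
print(avg)  # compare with the exact expectation from part (b)
```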

2. (a) Suppose X ∼ N(0, 1). Find the p.d.f. of Y = X².

(b) Let X1 and X2 be two independent random variables; X1 has an exponential distribution with mean 1, and X2 has an exponential distribution with mean 2. Find the p.d.f. of Y = 2X1 + X2.

(c) Let X1 and X2 be two independent exponentially distributed random variables, each with mean 1. Find P(X1 > X2 | X1 < 2X2).

3. Let Z be a standard normal random variable and let Y1 = Z and Y2 = Z².

(a) Find E(Y1), E(Y2), and E(Y1Y2).

(b) Find Cov(Y1, Y2). Are Y1 and Y2 independent?

4. Suppose that X1, · · · , Xk are iid N(μ, σ²), k ≥ 2. Denote:

U1 = Σ_{i=1}^{k} X_i,   U_j = X1 − X_j for j = 2, · · · , k.

(a) Show that U = (U1, · · · , Uk) has a k-dimensional normal distribution;

(b) Show that U1 and (U2, · · · , Uk) are independent;

(c) Express S² as a function of U2, · · · , Uk alone. Hence, show that X̄ and S² are independently distributed. (Hint: You may use the fact that (k choose 2) S² = Σ_{1 ≤ i < j ≤ k} (1/2)(X_i − X_j)².)


Page 15: Basic Probability Exam Packet

5. (a) Let {ξ_n, n ≥ 1} be a sequence of independently, identically distributed random variables with E(ξ1) = μ, Var(ξ1) = σ² < ∞, and P(ξ1 = 0) = 0. Prove that

(ξ1 + ξ2 + · · · + ξn)/(ξ1² + ξ2² + · · · + ξn²) → μ/(μ² + σ²), as n → ∞,

in probability. (Hint: you may use the theorem that says if Xn converges to X in probability and Yn converges to Y in probability, and if f is continuous, then f(Xn, Yn) converges to f(X, Y) in probability. If further X = a and Y = b are constants, then f only needs to be continuous at (a, b).)

(b) Let X_n = n with probability 1/n, and X_n = 0 with probability 1 − 1/n. Show that X_n converges in probability to zero, but E(X_n) and Var(X_n) do not converge to zero.


Page 16: Basic Probability Exam Packet

Department of Mathematics and Statistics

Basic Probability Exam

January 2010

Work all problems. Show your work; explain your answers; state theorems used whenever possible.

1. A gene has two possible forms (alleles): A and a. Thus there are three possible genotypes: AA, aA, and aa. Number them 1, 2, and 3, respectively. Assume that their proportions in the population are p², 2pq, and q², respectively (q = 1 − p).

For a family with a father, a mother, and one child, let the random variables F, M, and C denote the genotypes of the father, mother, and child, respectively. For example, F is either 1, 2, or 3, according to the genotype of the father. Assume that F is independent of M, i.e. that the population mates randomly, and that the conditional distribution of C given (F, M) is determined by the familiar rules of genetics. (Children inherit one gene from each parent; each parent's gene has probability 0.5 of being chosen; the mother's contribution is independent of the father's contribution.) Let p_ik = Pr[C = k | M = i], the conditional probability that the child is of type k given that the mother (or father) is of type i. Compute the nine probabilities p_ik in terms of p and q.

2. Let Y = (Y1, Y2, Y3)ᵀ have a trivariate Gaussian distribution with mean vector μ = (1, −1, 2)ᵀ and covariance matrix

Σ =
[ 1  ρ  0 ]
[ ρ  1  ρ ]
[ 0  ρ  1 ].

(a) For which values of ρ are Y1 + Y2 + Y3 and Y1 − Y2 − Y3 statistically independent?

(b) What is the distribution of Y1 + Y2 + Y3, including its name and associated parameters.

3. Suppose that X is a random variable with density (3x + 1)/8 on the interval (0, 2). Let Y be the area of a circle of radius X. Find the density of Y.

4. (a) A continuous random variable Y takes values on the interval (0, ∞). Show E[Y] = ∫₀^∞ Pr[Y ≥ y] dy. Hint: you may use the fact that y = ∫₀^y dz.

(b) A discrete random variable X takes values on the positive integers 1, 2, . . . . Show E[X] = Σ_{x=1}^{∞} Pr[X ≥ x].

5. A family of densities is called a univariate natural exponential family if, for some function A(θ), the density of X given θ can be expressed as

p(x | θ) = h(x) e^{θx − A(θ)}.

Suppose that X has such a density.

(a) Show that the moment generating function M_{X|θ}(t) = E[e^{tX}] is e^{A(θ+t) − A(θ)}.

(b) Show that E[X] = A′(θ).


Page 17: Basic Probability Exam Packet

Department of Mathematics and Statistics

Basic Probability Exam

August 2009

Work all problems. Show your work. Explain your answers. State

the theorems used whenever possible. 60 points are needed to pass

at the Masters level and 75 to pass at the Ph.D. level

1. An urn contains nine chips, five red and four white. Three are drawn out at random without replacement. Let X denote the number of red chips in the sample. Let Y denote the payment in dollars received by a player, depending on X, such that Y = (−2)^X.

(a) (6 pt) Find the distribution of X.
(b) (4 pt) Compute the expected payment of a player, i.e., that of Y.

2. Consider a random variable Y with probability density function (pdf) given by

f(y) = c e^{−y²/2}, −∞ < y < ∞.

(a) (5 pt) Find c.
(b) (5 pt) Derive the moment-generating function of Y.
(c) (5 pt) Find the expected value and variance of Y.
(d) (10 pt) What is the pdf of Y²?

3. Let X1 and X2 be independent standard normal random variables. Let U be independent of X1 and X2, and assume that U is uniformly distributed over (0, 1). Define Z = UX1 + (1 − U)X2.

(a) (5 pt) Find the conditional distribution of Z given U = u.
(b) (5 pt) Find the expected value of Z, E(Z).
(c) (15 pt) Find the variance of Z, V(Z).

4. (15 pt) A blood test is 99 percent effective in detecting a certain disease when the disease is present. However, the test also yields a false-positive result for 2 percent of the healthy patients tested, who have no such disease. Suppose 0.5 percent of the population has the disease. Find the conditional probability that a randomly tested individual actually has the disease given that his or her test result is positive.

5. (a) (5 pt) State carefully the Central Limit Theorem for a sequence of i.i.d. random variables.

(b) (5 pt) Suppose X1, . . . , X100 ∼ i.i.d. Unif(0, 1). What is the standard deviation of X̄, the mean of X1, . . . , X100?

(c) (15 pt) Use the Central Limit Theorem to find approximately the probability that the average of the 100 numbers chosen exceeds 0.56. You may use the approximation 1/√12 ≈ 0.3.


Page 18: Basic Probability Exam Packet

Department of Mathematics and Statistics

Basic Probability Exam

January 2009

Work all problems. Show your work. Explain your answers. State

the theorems used whenever possible. 60 points are needed to pass

at the Masters level and 75 to pass at the Ph.D. level

1. Ecologists are studying salamanders in a forest. There are two types of forest. Type A is conducive to salamanders while type B is not. They are studying one forest but don't know which type it is. Types A and B are equally likely.

During the study, they randomly sample quadrats. (A quadrat is a square-meter plot.) In each quadrat they count the number of salamanders. Some quadrats have poor salamander habitat. In those quadrats the number of salamanders is 0. Other quadrats have good salamander habitat. In those quadrats the number of salamanders is either 0, 1, 2, or 3, with probabilities 0.1, 0.3, 0.4, and 0.2, respectively. (Yes, there might be no salamanders in a quadrat with good habitat.) In a type A forest, the probability that a quadrat is good is 0.8 and the probability that it is poor is 0.2. In a type B forest the probability that a quadrat is good is 0.3 and the probability that it is poor is 0.7.

(a) 4 pts On average, what is the probability that a quadrat is good?
(b) 5 pts On average, what is the probability that a quadrat has 0 salamanders, 1 salamander, 2 salamanders, 3 salamanders?
(c) 4 pts The ecologists sample the first quadrat. It has 0 salamanders. What is the probability that the quadrat is good?
(d) 4 pts Given that the quadrat had 0 salamanders, what is the probability that the forest is type A?
(e) 4 pts Now the ecologists prepare to sample the second quadrat. Given the results from the first quadrat, what is the probability that the second quadrat is good?
(f) 4 pts Given the results from the first quadrat, what is the probability that they find no salamanders in the second quadrat?


Page 19: Basic Probability Exam Packet

2. A Poisson random variable with mean μ has the following p.d.f.:

f(x) = e^{−μ} μ^x / x!, for x = 0, 1, 2, . . .

(a) 11 pts Let X be Poisson with mean μ. Compute the moment generating function of X. It is known that:

e^y = Σ_{k=0}^{∞} y^k / k!.

(b) 7 pts If X1, . . . , Xn are independent Poisson variables with means μ1, . . . , μn, find the moment generating function of

Y = Σ_{k=1}^{n} X_k.

(c) 7 pts What is the distribution of Y ?


Page 20: Basic Probability Exam Packet

3. Let X ∼ N(μ, σ²).

(a) 12 pts Show that the moment generating function of X is:

M_X(t) = exp(μt + σ²t²/2)

(b) 8 pts Show that if X1 ∼ N(μ1, σ1²) and X2 ∼ N(μ2, σ2²) and X1 and X2 are independent, then X1 + X2 ∼ N(μ1 + μ2, σ1² + σ2²).


Page 21: Basic Probability Exam Packet

4. (a) Let (X1, X2) be distributed uniformly on the disk where X1² + X2² ≤ 1. Let R = √(X1² + X2²) and Θ = arctan(X1/X2). Hint: it may help to draw a picture.
i. 1 pt What is the joint density p(x1, x2)?
ii. 4 pts Are X1 and X2 independent? Explain.
iii. 15 pts Find the joint density p(r, θ).
iv. 4 pts Are R and Θ independent? Explain.

(b) Let (X1, X2) be distributed uniformly on the square whose corners are (1, 1), (−1, 1), (−1, −1), and (1, −1). Let R = √(X1² + X2²) and Θ = arctan(X1/X2).
i. 1 pt What is the joint density p(x1, x2)?
ii. 1 pt Are X1 and X2 independent? Explain.
iii. 4 pts Are R and Θ independent? Explain.


Page 22: Basic Probability Exam Packet

DEPARTMENT OF MATHEMATICS AND STATISTICS
UMASS - AMHERST

BASIC EXAM - PROBABILITY
August 2008

Work all problems. Show all work. Explain your answers. State the

theorems used whenever possible. 60 points are needed to pass at the

Masters Level and 75 to pass at the Ph.D. level.

1. Let X have the double exponential distribution with density f(x) = (1/2) exp(−|x|) for −∞ < x < ∞.

(a) (6pts) Find the moment generating function (MGF) of X.

(b) (6pts) Compute the mean and variance of X using the MGF obtained in (a).

(c) (6pts) Let X1, · · · , X100 be independent random variables, all of which have the double exponential distribution. Let X̄ = (1/100) Σ_{i=1}^{100} X_i. Find ε such that P(|X̄| < ε) ≈ 0.95.

(d) (6pts) Find the distribution of Y = |X|.

2. Let X1 and X2 be independent, identically distributed random variables that have the common probability density function f(x) = exp(−x), x > 0. Let Y1 = X1 + X2 and Y2 = X1 − X2.

(a) (6pts) Find the joint pdf of Y1 and Y2.

(b) (6pts) Find the marginal pdf of Y1.

(c) (6pts) Find the conditional pdf of Y2, given that Y1 = y1.

3. The hourly number of phone calls, X, received by a switchboard at a specific company is Poisson with parameter λ. λ varies independently from hour to hour according to the following exponential distribution: f(λ) = θ exp(−θλ), λ > 0. Answers to the following questions may depend on θ.

(a) (7pts) Find an integral expression for P(X = 4 | λ ≤ 6). You do not need to evaluate the integral.

(b) (7pts) Find E(X | λ) and its distribution.

(c) (7pts) Use the preceding part to calculate E(X).

(d) (7pts) Assuming that the number of phone calls to the switchboard is independent from hour to hour, how many hours would be expected to have exactly 3 phone calls during a 24-hour period?


Page 23: Basic Probability Exam Packet

4. Let X1, · · · , Xn be iid with density f(x) = θ(1 − x)^{θ−1}, 0 < x < 1. Define X(n) = max_{1≤i≤n} X_i.

(a) (7pts) Find P(n^α (1 − X(n)) < x) for any fixed α and x ∈ (0, 1).

(b) (9pts) State the definition of convergence in distribution and find a value of α so that n^α (1 − X(n)) converges in distribution.

(c) (7pts) Let X̄_n = n^{−1} Σ_{i=1}^{n} X_i. Show that √n (X̄_n − 1/(1 + θ)) converges in distribution and say what it converges to.

(d) (7pts) Let T_n = X̄_n². Find an approximate distribution of T_n as n goes to infinity.


Page 24: Basic Probability Exam Packet

DEPARTMENT OF MATHEMATICS AND STATISTICS
UMASS - AMHERST

BASIC EXAM - PROBABILITY
WINTER 2008

Work all problems. 60 points are needed to pass at the Masters Level and 75 to pass at

the Ph.D. level.

1. Suppose an individual can be either diseased (D) or healthy (H). A medical diagnostic test can say that an individual is diseased (+) or healthy (−). In a particular population,

• the probability an individual is D and the test is + is 0.009,
• the probability an individual is D and the test is − is 0.001,
• the probability an individual is H and the test is + is 0.099,
• and the probability an individual is H and the test is − is 0.891.

(a) (10 pts) Suppose an individual is randomly selected from that particular population. What is the probability that she is healthy?

(b) (5 pts) Given that an individual is diseased, what is the probability that the test is +?

(c) (5 pts) Suppose an individual gets the test, and the test is positive. What is the probability that the individual is diseased?

2. Suppose X ∼ U(0, 1). Note that f(x) = 1 when 0 ≤ x ≤ 1 and f(x) = 0 otherwise.

(a) (5 pts) What is the probability that √X is less than 1/2?

(b) (10 pts) What is the second moment of 1/X?

(c) (10 pts) Suppose that G(y) is the cumulative distribution function (CDF) of a continuous random variable with probability distribution function (PDF) g(y). Note that G(y) = ∫_{−∞}^{y} g(t) dt. Let G^{−1}(p) be the inverse of the CDF. Let Z = G^{−1}(X). Derive the PDF of Z.
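Part (c) above describes the inverse-CDF (inverse-transform) construction. As an illustration (not from the original exam), a minimal sketch using the Exponential(1) CDF as an assumed choice of G:

```python
import math
import random

# Assumed example: G is the CDF of an Exponential(1) variable, G(y) = 1 - exp(-y),
# so the inverse CDF is G^{-1}(p) = -log(1 - p).
def g_inverse(p):
    return -math.log(1.0 - p)

rng = random.Random(0)
# Z = G^{-1}(X) with X ~ U(0, 1); by part (c), Z should have PDF g.
z = [g_inverse(rng.random()) for _ in range(100_000)]
print(sum(z) / len(z))  # sample mean, close to 1 for Exponential(1)
```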

3. Suppose Y has PDF f(y) = c exp(−y/2) when y > 0 and f(y) = 0 otherwise.

(a) (5 pts) What is c?
(b) (5 pts) Derive the moment generating function of Y.

(c) (5 pts) Use the moment generating function to find E(Y^k) for k = 1, 2, 3.

(d) (5 pts) Prove that E(Y^k) ≥ E(Y)^k for k > 1. It is OK to cite a theorem.

(e) (5 pts) What is the PDF of 3Y ?


Page 25: Basic Probability Exam Packet

(f) (5 pts) Let X = Y·1_{Y>3} − 3, where 1_{Y>3} = 1 if Y > 3 and 0 otherwise. Derive E(X).

4. Suppose X_i, i = 1, . . . are independent and identically distributed with mean μ and variance σ² < ∞. Let Z_i = (X_i − μ)/σ.

(a) (5 pts) Let M_n = n^{−1} Σ_{i=1}^{n} Z_i. Prove that lim_{n→∞} Pr(|M_n| > ε) = 0 for any ε > 0. For partial credit, you may just state a theorem.

(b) (10 pts) Let f(n) be a function of n. Find an f(n) so that the variance of f(n)M_n is 1.

(c) (10 pts) Let A be a constant. State and apply a theorem that will allow you to determine lim_{n→∞} Pr(f(n)M_n < A).


Page 26: Basic Probability Exam Packet

DEPARTMENT OF MATHEMATICS AND STATISTICS
UMASS - AMHERST

BASIC EXAM - PROBABILITY
FALL 2007

Work all problems. 60 points are needed to pass at the Masters Level and 75 to pass at

the Ph.D. level.

1. Suppose X_i ∼ i.i.d. U(0, 10), i = 1, . . . , n. Note that f(x) = 1/10, 0 ≤ x ≤ 10, E(X) = 5 and Var(X) = 100/12.

(a) (5 pts) Write down an expression for the probability that all X_i's are greater than 1.

(b) (10 pts) Let X̄ = n^{−1} Σ_{i=1}^{n} X_i. Find an expression that involves X̄ and known constants that converges to a N(0, 1) distribution as n gets large. What theorem is your result based on?

(c) (10 pts) Find the mean of 1/X_i.

2. Let X and Y be random variables with pdf:

f_{X,Y}(x, y) = 1 for 0 ≤ x ≤ 1, x ≤ y ≤ x + 1, and f_{X,Y}(x, y) = 0 otherwise.

(a) (5 pts) Show that f(x, y) is a density.

(b) (5 pts) Are X and Y independent? Why or why not?

(c) (5 pts) Find f_X(x).

(d) (5 pts) Find E(Y |X = x).

(e) (5 pts) Find Pr(X + Y < 0.5)

3. Suppose X ∼ Bin(n, p) and Y ∼ Bin(m, p), independently. Note that the Bin(k, q) probability mass function is

(k choose x) q^x (1 − q)^{k−x}, x = 0, . . . , k, 0 ≤ q ≤ 1.

(a) (15 pts) Find the conditional distribution of X given that X +Y = j.

Give the probability mass function of this conditional distribution

and identify it by its family name and parameters.

(b) (5 pts) What is Pr(X > Y )?

(c) (5 pts) What is Pr(X/Y = 1)?

4. Suppose X | Y = y ∼ Poisson(y) and Y ∼ Unif(0, 1). (The Poisson(λ) pmf is f(x) = exp(−λ) λ^x / x! when λ > 0 and x = 0, 1, 2, . . . , and zero otherwise.)

(a) (5 pts) What is the mean of X?

(b) (10 pts) What is the marginal distribution of X?

(c) (10 pts) What are E(XY ) and Var(XY )?


Page 27: Basic Probability Exam Packet

DEPARTMENT OF MATHEMATICS AND STATISTICS
UNIVERSITY OF MASSACHUSETTS

BASIC EXAM – PROBABILITY
January, 2007

Work all problems. 60 points are needed to pass at the Master’s level and 75 to pass at the

Ph.D. level.

1. (25 points) Suppose that X has density function f(x) = b(1 − x²), |x| < c, and f(x) is zero otherwise.

(a) Find constants b and c so that f(x) is a density function. (Note that there is more

than one right answer.)

(b) Derive the first and second moments of X.

(c) Derive the conditional density of X given that X is greater than 0.

2. (15 points) Suppose we have three cards. The first one is blank on both sides, the

second has an X on one side and is blank on the other, and the third has an X on both

sides. We run an "experiment" where we choose one card at random and then look at

one side of the chosen card at random.

(a) What is the probability that you see an X?

(b) What is the probability that you see an X and the other side has an X too?

(c) Suppose we run the experiment above and we see an X. Given that outcome, what

is the probability that the other side of the card has an X on it too?

3. (20 points) Let X be a random variable with E(X) = μ and Pr(X = μ) < 1.

(a) Is E{exp(X)} equal to, less than, or greater than exp(EX) = exp(μ) in general? Why?

(b) Now, suppose X ∼ N(0, 1) with density (1/√(2π)) exp(−x²/2).

i. What is the moment generating function of X?

ii. Use the previous result (part i) to derive E {exp(X)} in this case.

4. (25 points) Let Y1, . . . , Yn be a random sample from a Poisson distribution with rate λ and Pr(Yi = k) = exp(−λ) λ^k / k!, k = 0, 1, . . . . Let Ȳ_n = n^{−1} Σ_{i=1}^{n} Yi.

(a) State the central limit theorem in general, and then use it to argue that n^{1/2}(Ȳ_n − λ)/√λ converges in distribution to a standard normal.

(b) Let Zi = 1 if Yi > 0 and Zi = 0 otherwise. Let α̂_n = n^{−1} Σ_{i=1}^{n} Zi. Find a μ and σ so that n^{1/2}(α̂_n − μ)/σ converges in distribution to a standard normal.

(c) Given that Ȳ_n converges to λ in probability as n goes to infinity, define a function of Y1, . . . , Yn that converges in probability to the σ in the previous question as n goes to infinity. Explain your answer, and name the results that you use to show convergence in probability.

5. (15 points) Let X have an exponential distribution with CDF 1 − exp(−x/β), x ≥ 0, β > 0.

Page 28: Basic Probability Exam Packet

(a) What is the pdf of X?

(b) Suppose that Y is independent of X and has the same distribution. Show that the distribution of Z = X + Y is gamma(2, β) with density f(z) = (1/(Γ(2)β²)) z exp(−z/β).

Page 29: Basic Probability Exam Packet

DEPARTMENT OF MATHEMATICS AND STATISTICS
UMASS - AMHERST

BASIC EXAM - PROBABILITY
FALL 2006

Work all problems. 60 points are needed to pass at the Master’s level and 75 to

pass at the Ph.D. level.

1. The logistic function is

p = f(x) = 1/(1 + exp(−x)), −∞ < x < ∞, 0 < p < 1.

(a) (10 pts) Suppose X ∼ N(0, σ²) with pdf (1/(σ√(2π))) exp(−x²/(2σ²)). What is the distribution of P = f(X)?

(b) (10 pts) Find a distribution for X so that P = f(X) has a Unif(0, 1) distribution. For full credit, you should prove that your choice works too.

2. Let Y1, . . . , Yn be a random sample from some distribution with Pr(Y_i = 1) = θ, 0 < θ < 1. Let θ̂_n = n^{−1} Σ_{i=1}^{n} Y_i.

(a) (15 pts) State the central limit theorem in general, and then use it to argue that n^{1/2}(θ̂_n − θ)/τ converges in distribution to a standard normal. Define τ as a function of θ.

(b) (15 pts) Define a τ̂_n that converges in probability to τ as n goes to infinity. Explain your answer, and name the results that you use to show convergence in probability.

3. Let X = 1 with probability p and X = 0 with probability 1 − p. Let Y be another random variable that can also be either zero or one. Let Pr(Y = 1 | X = 1) = r and Pr(Y = 1 | X = 0) = s.

(a) (10 pts) Find Pr(Y = 1) and E(Y ).

(b) (10 pts) Find V ar(Y ).

(c) (5 pts) What is the distribution of Z = X/(Y + 1)?

4. Let X be a random variable with E(X) = λ and Pr(X = λ) < 1.

(a) (10 pts) Does E(X^{0.5}) = (EX)^{0.5} = λ^{0.5}? Why or why not? If not, give an inequality.

(b) (15 pts) Suppose X ∼ Exp(λ) with pdf f(x; λ) = (1/λ) exp(−x/λ), x > 0, λ > 0. What is E(X^{1.5})? Hint: It may help to recall that the gamma distribution is g(x, α, β) = x^{α−1} exp(−x/β)/(Γ(α) β^α).


Page 30: Basic Probability Exam Packet

DEPARTMENT OF MATHEMATICS AND STATISTICS
UNIVERSITY OF MASSACHUSETTS

BASIC EXAM – PROBABILITY
August 31, 2005

Work all problems. 60 points are needed to pass at the Master’s level and 75 to pass at the

Ph.D. level.

1. (25 points) Suppose that X and Y have joint density function f(x, y) = c(x+y)I(0,1)(x)I(0,1)(y).

(a) What must the constant c be in order for f(x, y) to be a density function?

(b) What are the mean and variance of X?

(c) Find the conditional distribution of Y given X = x.

2. (15 points) Suppose the number of customers Y entering a bank in a one hour period is distributed Poisson with mean 10 (pmf: e^{−10} 10^y / y!). Suppose that given Y = y the total time T needed to service the y customers has an exponential distribution with mean 3y (pdf: e^{−t/(3y)}/(3y)). Find the unconditional mean and variance of T.

3. (15 points) Let X1 and X2 be iid with Uniform(-1,1) distributions.

(a) Find the pdf of Y = X1².

(b) Let Z = X1X2. Are Y and Z independent? Why or why not?

4. (20 points) Two players, A and B, are playing a game. The game consists of a series of trials, and the first player to win two more trials than the other will win the game. Suppose that each trial is iid, and the probabilities that either player wins a trial are:

Pr(A wins a trial) = p

Pr(B wins a trial) = q = 1 − p.

(a) What is the probability that A wins in exactly 6 trials?

(b) What is the probability that A wins in exactly 2n trials?

(c) What is the probability that A wins?
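As an illustration (not from the original exam), a simulation sketch of the game in problem 4; the value p = 0.6, the trial count, and the seed are arbitrary choices.

```python
import random

def play_game(p, rng):
    """Play i.i.d. trials until one player leads by two; return (winner, number of trials)."""
    lead = 0  # (wins by A) - (wins by B)
    trials = 0
    while abs(lead) < 2:
        lead += 1 if rng.random() < p else -1
        trials += 1
    return ("A" if lead == 2 else "B"), trials

rng = random.Random(0)
p = 0.6  # assumed value for the illustration
games = [play_game(p, rng) for _ in range(100_000)]
print(sum(1 for w, _ in games if w == "A") / len(games))             # empirical Pr(A wins), part (c)
print(sum(1 for w, t in games if w == "A" and t == 6) / len(games))  # empirical Pr(A wins in exactly 6 trials), part (a)
```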

5. (25 points) Let X_t be the number of fish that a particular fishing boat catches on trip t. Suppose that the number of fish caught is independent from trip to trip, the mean number of fish caught on any particular trip is 49, and the variance is also 49.

(a) Over the course of the season of 100 trips, what is the approximate probability that the mean number of fish that are actually caught is no less than 48? (You may leave your answer as a formula.)

(b) Let Y_t be the profit from trip t. Due to market forces suppose that Y_t = log(X_t). Is the mean of Y_t greater than, less than, or equal to log(49), or do you need more information? (And why?)

(c) Next, make the additional assumption that the number of fish caught on a particular trip has a Poisson distribution. What is the probability that no fish are caught on at least one out of 250 trips? (You may leave your answer as a formula.)

Page 31: Basic Probability Exam Packet

(d) A slightly more sophisticated model posits that on overcast days the numbers of fish caught are iid with mean μ_o and standard deviation σ_o. On sunny days the numbers caught are iid with mean and standard deviation μ_s and σ_s. Suppose that the probability of an overcast day is p. What are the marginal mean and variance of the number of fish caught under this model?

Page 32: Basic Probability Exam Packet

DEPARTMENT OF MATHEMATICS AND STATISTICS
UNIVERSITY OF MASSACHUSETTS

BASIC EXAM: PROBABILITY
JANUARY 2005

Work all problems. Sixty points are needed to pass at the Master’s level and

seventy-five at the Ph.D. level

1. (20 points) A Poisson random variable with mean μ has pmf:

f(x) = exp(−μ) μ^x / x!, x = 0, 1, 2, . . .

(a) Let X be Poisson with mean μ. Compute the moment generating function of X. It may help to remember that:

exp(y) = Σ_{k=0}^{∞} y^k / k!.

(b) Let X1, X2 be independent Poisson variables with means μ1, μ2, and let a1, a2 be positive constants. What is the moment generating function of Y = Σ_{i=1}^{2} a_i X_i?
(c) What is the distribution of Y?

2. (20 points) Let X and Y have the joint density function f(x, y) = c, 0 ≤ x ≤ y ≤ 1.

(a) Find c.
(b) What is the marginal pdf of X?
(c) Are X and Y independent? Why or why not?

3. (20 points) A weed is exposed to a known dose of weed killer (X). The weed either survives (Y = 1) or dies (Y = 0). Suppose the weed has an unobserved natural tolerance to the weed killer (denoted by Z), and assume that this tolerance has a standard normal distribution. Further, suppose that the weed survives if and only if Z > −X. Note that Z is random and X is fixed.

(a) What is the probability that the weed survives?
(b) What is the distribution of Z given that the weed is not killed?
(c) Derive the moment generating function for Z given that Y = 1. You may express your answer as an unsimplified integral that involves the standard normal pdf (φ(·)), cdf (Φ(·)), and other functions.

(d) Use the result from the previous part to derive:

E(Z | Y = 1) = φ(−X)/(1 − Φ(−X)) = φ(X)/Φ(X).


Page 33: Basic Probability Exam Packet

4. (20 points) A game is played with n coins. Coins 1 through n − 1 are "fair" and land heads with probability 1/2. The nth coin has two heads; it always lands heads up. The game consists of drawing coins blindly from the bag, flipping them, and replacing them back into the bag.

(a) Let T be the number of coins that must be drawn and flipped until one sees a total of 3 tails. What is the mean of T?
(b) What is the probability that T strictly exceeds 6?
(c) Suppose one coin is drawn from the bag, flipped, and it lands heads. What is the probability that it is the unfair coin (the nth coin)?
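As an illustration (not from the original exam), a simulation sketch of the coin game; n = 5 coins is an assumed value. On any draw the flip shows tails with probability (n − 1)/(2n), since the two-headed coin never does.

```python
import random

def draws_until_three_tails(n, rng):
    """Draw a coin at random, flip it, replace it; stop after the third tail."""
    tails = 0
    draws = 0
    while tails < 3:
        draws += 1
        coin_is_fair = rng.randrange(n) != 0  # coin index 0 is the two-headed coin
        if coin_is_fair and rng.random() < 0.5:
            tails += 1
    return draws

rng = random.Random(0)
n = 5  # assumed number of coins for the illustration
sims = [draws_until_three_tails(n, rng) for _ in range(100_000)]
print(sum(sims) / len(sims))                      # empirical E(T), part (a)
print(sum(1 for t in sims if t > 6) / len(sims))  # empirical Pr(T > 6), part (b)
```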

5. (20 points) Joe walks to and from work each day. The commute to work, T_i, has mean μ_T and variance σ_T². The commute from work, F_i, has mean μ_F and variance σ_F². Further, suppose T_i and F_i are mutually independent. Let D_i = T_i − F_i.

(a) What are the mean and variance of D_i?
(b) Let D̄_100 be the mean difference over 100 days: D̄_100 = Σ_{i=1}^{100} D_i / 100. Write an approximation for the probability that D̄_100 is negative.


Page 34: Basic Probability Exam Packet

DEPARTMENT OF MATHEMATICS AND STATISTICS
UNIVERSITY OF MASSACHUSETTS

BASIC EXAM - PROBABILITY
FRIDAY, SEPTEMBER 3, 2004

Work all problems. Sixty points are needed to pass at the Master’s level and seventy-five

at the Ph.D. level.

1. (20 points) Let X have a Poisson distribution.

(a) Give the probability mass function for X.

(b) Derive the moment generating function of X.

(c) Derive the mean and variance of X. (You can do this using b) if you want, butyou don’t have to.)

2. (20 points) Suppose a plant is manufacturing a product using three different machines 1, 2 and 3, and a large inventory has been built up which consists of 30% from 1, 20% from 2 and 50% from 3. Suppose 100 items are selected. With a large inventory we will treat the selections as independent, where on each draw the probability is .3, .2 and .5 of getting an item from machine 1, 2 or 3, respectively.

(a) Derive the joint distribution of X1, X2, X3 where Xj = the number of items selected from machine j.

(b) Suppose that each item has a lifetime, and the distribution of lifetimes has mean 5 and standard deviation 1 for machine 1; mean 6 and standard deviation .5 for machine 2; and mean 7 and standard deviation .8 for machine 3. Let T denote the total lifetime of the 100 items selected (see part a). Find the expected value and variance of T.

3. (20 points) Let X = (X1, X2, . . . , Xn) be a vector-valued random variable and let Ti = Ti(X), i = 1, 2, . . . , d. Suppose that the probability density function of X, parameterized by η = (η1, . . . , ηd) in an open interval in R^d, is given by

g_η(x) = h(x) exp{ Σ_{i=1}^{d} ηi Ti(x) − K(η) },

where it is assumed that

∫_{R^d} h(x) exp{ Σ_{i=1}^{d} ηi Ti(x) } dx < +∞.

(a) Show that

K(η) = log( ∫ h(x) exp{ Σ_{i=1}^{d} ηi Ti(x) } dx ).

(b) Show that

E[Ti] = ∂K(η)/∂ηi.


Page 35: Basic Probability Exam Packet

(c) Show that

cov(Ti, Tj) = ∂²K(η)/(∂ηi ∂ηj).

4. (25 points) Let X1 and X2 be independent exponential random variables with mean 1. Define Y1 = X1 + X2 and Y2 = X1.

(a) Derive the joint distribution (giving the joint density suffices) of (Y1, Y2).

(b) Find the conditional distribution of Y1 given Y2 = y2.

(c) Give an approximation to the variance of Y1/Y2.

5. (15 points) Let (X1, . . . , Xn) and (Y1, . . . , Yn) be two different i.i.d. random sequences with

E[Xi] = μ_X, E[Yi] = μ_Y, Var(Xi) = σ_X² > 0, Var(Yi) = σ_Y² > 0.

We denote X̄ = n^{−1}(X1 + . . . + Xn), Ȳ = n^{−1}(Y1 + . . . + Yn).

Identify and justify the limiting distribution of

√n(X̄ − μ_X) + Ȳ

as n → +∞.

You can appeal to well known results but state clearly what results you are using and how they apply here.


Page 36: Basic Probability Exam Packet

DEPARTMENT OF MATHEMATICS AND STATISTICS
UNIVERSITY OF MASSACHUSETTS

BASIC EXAM – PROBABILITY
August 27, 2003

Work all problems. 60 points are needed to pass at the Master’s level and 75 to pass at the Ph.D.

level.

1. (20 pts) Suppose 3 boys and 3 girls stand in a line in random order.

(a) What is the probability that the subsequence of boys is ascending in height and the

subsequence of girls is ascending in height?

(b) What is the probability that the sequence alternates between boys and girls?

(c) What is the probability that the three boys are not together?

2. (20 pts) Let X ∼ N(μ, σ²).

(a) Show the MGF of X is M_X(t) = e^{μt + σ²t²/2}.

(b) Show that if X1 ∼ N(μ1, σ1²) and X2 ∼ N(μ2, σ2²) with X1 and X2 independent, then X1 + X2 ∼ N(μ1 + μ2, σ1² + σ2²).

3. (20 points) Suppose each of 100 genes has probability 0.2 of mutating in a given time period,

and the genes act independently. Let N = the number of the genes which mutate.

(a) Derive the exact distribution of N .

(b) State the CLT (central limit theorem) and use it to approximate P (N < 10).

4. (20 pts) The daily number of visits to a particular website, X, is Poisson with parameter Λ. Λ varies independently from day to day according to an exponential distribution:

f(λ) = 1_{(0,∞)}(λ) c e^{−cλ}.

(a) Find an integral expression for P(X = 3, Λ ≤ 5). Don't evaluate the integral.

(b) What is the distribution of the random variable E(X | Λ)?

(c) Use the preceding part to compute E(X).

(d) Assuming the number of visits to the website is independent from day to day, how many

days would be expected to have exactly 2 visits over the course of a year?

5. (20 pts) Let X1 and X2 be independent random variables each having an exponential distribution with mean 1. Let

Y1 = X1/X2

Y2 = X2

(a) Without any calculation give the marginal distribution of Y2 and the conditional distribution of Y1 given Y2 = y2.

(b) Find the joint pdf of Y1 and Y2.

(c) Find the marginal pdf of Y1.

Page 37: Basic Probability Exam Packet

DEPARTMENT OF MATHEMATICS AND STATISTICS
UNIVERSITY OF MASSACHUSETTS

BASIC EXAM – PROBABILITY
August 28, 2002

Work all problems. 60 points are needed to pass at the Master’s level and 75 to pass at the Ph.D.

level.

1. (20 pts) Choose a point P in the plane by letting the "x" and "y" coordinates, P_x and P_y, be independent N(0, 1) random variables.

(a) What is the joint density function, f(x, y), of P_x and P_y?

(b) Create the point Q by rotating P clockwise by θ radians around the origin. Express the coordinates of Q, Q_x, Q_y, as a function of the coordinates of P and find the joint density function for Q_x and Q_y.

(c) Are Q_x and Q_y independent?
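As an illustration (not from the original exam), a numerical sketch of problem 1: rotate i.i.d. N(0, 1) coordinates clockwise by θ and check that the sample covariance of (Q_x, Q_y) stays close to the identity. The angle, sample size, and seed are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 0.7   # assumed rotation angle, in radians
n = 200_000

P = rng.standard_normal((n, 2))      # columns are P_x, P_y
c, s = np.cos(theta), np.sin(theta)
R = np.array([[c, s], [-s, c]])      # clockwise rotation by theta
Q = P @ R.T                          # rows are (Q_x, Q_y)

print(np.cov(Q, rowvar=False))       # close to the 2x2 identity matrix
```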

2. (20 pts) The number of eggs Y laid by an insect has a Poisson distribution with expected value E(Y) = λ. Given the insect lays Y = y eggs, the number of those surviving, X, has a Binomial distribution with sample size y and probability p.

(a) Compute the overall expected number of eggs that will survive. You must justify your

answer.

(b) Show that V (X) = E(X).

(c) Suppose that the average number λ of eggs laid by an insect is a function of the insect's age. In particular, assume that λ has an exponential distribution with parameter θ = E(λ). Compute the overall expected number of eggs that will survive as a function of θ. You must justify your answer.

3. (20 pts) A rat is exposed to a known dose of X units of poison and either survives (Y = 1) or dies (Y = 0). Suppose the rat also has an unobserved natural tolerance to the poison (Z), and assume that this tolerance has a standard normal distribution. Further, suppose that the rat survives if and only if Z > −X. Note that X is a fixed quantity, and Z is random.

(a) What is the probability that the rat survives?

(b) What is the distribution of Z given that Y = 1?

(c) Derive the moment generating function for Z given that Y = 1. You might want to express your answer in terms of φ or Φ, where φ(·) is the standard Gaussian PDF and Φ(·) is the standard Gaussian CDF.

(d) Again, assume that the rat survives. Use the moment generating function derived in part (c) to show that

E(Z | Y = 1) = φ(−X)/(1 − Φ(−X)) = φ(X)/Φ(X).

Page 38: Basic Probability Exam Packet

4. (25 pts) Suppose a person is at risk for two ways of dying, dying from cancer or dying from a heart attack. The time until a heart attack is modeled with an exponential distribution with mean μ^{−1} months, and the time until death from cancer is modeled with an exponential distribution with mean λ^{−1} months. Let the two times be independent. We will only observe the time until death (the minimum of the two exponential random variables) and the cause of death. The larger of the two random variables will not be observed.

(a) What is the distribution of the observed time until death?

(b) Find the probability that the person dies from cancer.

(c) Show that the probability that the person dies of cancer within k months is (λ/(μ + λ)){1 − e^{−k(μ+λ)}}.

(d) Use parts (a), (b), and (c) to determine whether the time until death is independent of the cause of death.

5. (15 pts) Let X1, X2, . . . be a sequence of independent random variables, with E(X_i) = μ_i and V(X_i) = σ_i², and define

Y_n = Σ_{i=1}^{n} (X_i − μ_i) / (Σ_{i=1}^{n} σ_i²)^{1/2}.

It can be shown that, under suitable conditions,

lim_{n→∞} P(Y_n ≤ y) = Φ(y),

where Φ(x) denotes the cumulative distribution function of the standard normal distribution.

Suppose that a computer test can generate an infinite number of questions arranged in a sequence from the easiest to the most difficult. Suppose also that the probability that a student will answer the ith question correctly is p_i = 1/(i + 1), and that all questions will be answered independently. Approximate the probability that a student will answer correctly at least 10 out of 100 questions.

Page 39: Basic Probability Exam Packet

DEPARTMENT OF MATHEMATICS AND STATISTICS
UNIVERSITY OF MASSACHUSETTS

BASIC EXAM – PROBABILITY
January 25, 2002

Work all problems. 60 points are sufficient to pass at the Master's level and 75 to pass at the Ph.D. level.

1. (24 pts) Let Q be the unit square in the xy-plane and A a region in Q. Let α be the area of A. Choose n points independently and uniformly distributed over Q. Let

X_i = 1 if the ith point lies in A, and X_i = 0 otherwise,

for i = 1, . . . , n.

(a) Find the expected value, E(X_i), and the variance, V(X_i), for 1 ≤ i ≤ n.

(b) Let X̄ = (1/n) Σ_{i=1}^{n} X_i. If n is large, X̄ is, with high probability, a good approximation to what number? Why?

(c) Use the CLT to determine how large n should be so that

P (|X � ↵| .01) = .99

when ↵ = .2.

(d) Use Chebychev’s inequality to give an upper bound for P (|X�↵| > .01) Does your answerrequire that n be “large?”

2. (18 pts) Let X have pdf

f(x) = e^(−x) for x > 0, and f(x) = 0 otherwise,

and let Y be the greatest integer less than or equal to X.

(a) Find the probability distribution of Y.

(b) Compute E(Y).

(c) Compute the variance of Y.
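A quick simulation sketch (Python) of Y = ⌊X⌋ that can be compared against whatever closed forms are derived; the seed and the decision to print only the first few values of k are arbitrary.

    import numpy as np

    rng = np.random.default_rng(2)
    Y = np.floor(rng.exponential(1.0, 1_000_000)).astype(int)   # Y = floor(X), X ~ Exp(1)
    for k in range(5):
        print(k, np.mean(Y == k))            # empirical P(Y = k)
    print(Y.mean(), Y.var())                 # empirical E(Y) and V(Y)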

3. (18 pts) Let X1 and X2 be random variables with joint pdf

f(x1, x2) = 4·x1·x2 for 0 < x1 < 1 and 0 < x2 < 1, and 0 otherwise.

Let

Y1 = X1 / X2
Y2 = X1 · X2

(a) Sketch the region S consisting of all points (y1, y2) such that f_{Y1,Y2}(y1, y2) > 0.

(b) Find the joint pdf of Y1 and Y2.

(c) Find the marginal pdf of Y1.


4. (12 pts) A Poisson random variable with mean µ has pdf

f(x) = e^(−µ) µ^x / x!  for x = 0, 1, 2, . . .

(a) Let X be Poisson with mean µ. Compute the moment generating function of X. It may help to remember that

e^y = Σ_{k=0}^∞ y^k / k!

(b) If X1, . . . , Xn are independent Poisson variables with means µ1, . . . , µn, what is the moment generating function of Y = Σ_{k=1}^n X_k?

(c) What is the distribution of Y?
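A small Python check for parts (b) and (c), assuming illustrative means (1.0, 2.5, 0.5): it compares the simulated distribution of the sum with a Poisson pmf whose mean is the sum of the means.

    import numpy as np
    from math import exp, factorial

    rng = np.random.default_rng(3)
    mus = [1.0, 2.5, 0.5]                              # illustrative means
    Y = sum(rng.poisson(mu, 1_000_000) for mu in mus)  # Y = X1 + X2 + X3
    m = sum(mus)
    for k in range(6):
        exact = exp(-m) * m**k / factorial(k)          # Poisson(sum of means) pmf
        print(k, np.mean(Y == k), round(exact, 4))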

5. (18 pts) Let X be a standard normal random variable, and let Y be a random variable such that E(Y|X = x) = ax + b, for some known a and b, and V(Y|X = x) = 1.

(a) Show that E(Y) = b.

(b) Show that V(Y) = 1 + a².

(c) Show that E(XY ) = a.

6. (12 pts) Three molecules of type A, three of type B, three of type C, and three of type D are to be linked together to form a chain molecule. One such chain molecule is ABCDABCDABCD.

(a) How many such chain molecules are there?

(b) Suppose all of the different molecule structures are equally likely. What is the probability that all three molecules of each type end up next to each other (as in BBBAAADDDCCC)?


DEPARTMENT OF MATHEMATICS AND STATISTICS
UNIVERSITY OF MASSACHUSETTS
BASIC EXAM – PROBABILITY
August 30, 2001

Work all problems. 60 points are needed to pass at the Master’s level and 75 to pass at the Ph.D.

level.

1. (17 points) In a children’s game a six-sided die is rolled until all six faces have come up. Let

T be the number of rolls it takes for this to happen. For example, for the sequence:

1 1 2 4 3 6 6 5 . . .  we have T = 8.

(a) Compute the expectation of T .

(b) Compute the variance of T .
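A minimal simulation sketch (Python) of T for comparison with the answers to (a) and (b); the number of replications and the seed are arbitrary.

    import numpy as np

    rng = np.random.default_rng(4)

    def rolls_until_all_faces():
        seen, t = set(), 0
        while len(seen) < 6:
            seen.add(rng.integers(1, 7))   # one roll of a fair six-sided die
            t += 1
        return t

    T = np.array([rolls_until_all_faces() for _ in range(100_000)])
    print(T.mean(), T.var())               # estimates of E(T) and V(T)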

2. (16 points) Let X have a Poisson distribution with parameter λ,

P(X = x) = e^(−λ) λ^x / x!  for x = 0, 1, . . . .

(a) Derive the moment generating function of X.

(b) Suppose X1, . . . , XN are independent random variables with X_i ~ Poisson(λ_i). What is the moment generating function of Y = Σ_{i=1}^N X_i?

(c) What is the distribution of Y ? You must justify your answer.

3. (17 points) An advertising agency sends out periodic mailings to two clients, American Buzz

Saw Inc., and Bailey’s Fine Fabrics. For each mailing, the president of the agency sends a

letter by messenger to her secretary; however, the letter gets lost with probability 1/4. If the

secretary receives the letter, he sends a copy to each of the two clients. The letter to ABS

has probability 4/5 of being received while the one to BFF has probability 1/6 of being lost.

The letters to the clients, if sent, are received or lost independently of each other. Let S be

the event “secretary receives letter,” A be the event “ABS receives letter,” and B be the event

“BFF receives letter.” Thus, for example P (A|S) = 4/5. If both clients receive their letters,

the mailing is deemed successful.

(a) Find P(B|S^c).

(b) Find P(A^c) and P(S|A^c).

(c) Determine whether or not the events A and B are independent.

(d) Suppose there are n mailings and that the behavior of the system is independent for each

item. Find the mean and variance of the number of successful mailings.
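A Monte Carlo sketch (Python) of a single mailing, using only the probabilities stated above; it can be used to sanity-check parts (c) and (d).

    import numpy as np

    rng = np.random.default_rng(5)
    n = 1_000_000
    S = rng.random(n) > 1/4                 # secretary receives the letter (lost w.p. 1/4)
    A = S & (rng.random(n) < 4/5)           # ABS receives its copy, given it was sent
    B = S & (rng.random(n) > 1/6)           # BFF's copy is lost w.p. 1/6, given it was sent
    success = A & B                         # mailing successful iff both clients receive letters
    print(A.mean(), B.mean(), (A & B).mean(), A.mean() * B.mean())   # compare for independence
    print(success.mean())                   # per-mailing success probability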


4. (17 points) In this problem you may use any properties of the standard normal density that

you need. Let

f(x, y) = (1 / (2π√(1 − ρ²))) exp{ −(x² − 2ρxy + y²) / (2(1 − ρ²)) }

for x, y ∈ ℝ, where −1 < ρ < 1.

(a) Verify that f(x, y) is a probability density function in the xy-plane.

(b) Let (X, Y) be the random vector whose joint density is f(x, y). Compute E(X). (Show the computation.)

(c) Find the marginal distribution of Y.

(d) Find the conditional density of X given Y = y. Which distribution is it?
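For intuition about this density, a small Python sketch that samples from it using the standard construction Y = ρX + √(1 − ρ²)·Z with X, Z independent standard normals; the value ρ = 0.6 and the seed are arbitrary.

    import numpy as np

    rng = np.random.default_rng(6)
    rho = 0.6                                   # arbitrary illustrative correlation
    X = rng.standard_normal(1_000_000)
    Z = rng.standard_normal(1_000_000)
    Y = rho * X + np.sqrt(1 - rho**2) * Z       # (X, Y) then has the joint density f(x, y) above
    print(X.mean(), Y.mean(), Y.var())          # marginal checks for (b) and (c)
    sel = np.abs(Y - 1.0) < 0.01                # condition on Y close to 1
    print(X[sel].mean(), X[sel].var())          # compare with the answer to (d) at y = 1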

5. (17 points) Let X1 and X2 be independent, identically distributed random variables having

common pdf

f(x) = e^(−x) if x ≥ 0, and f(x) = 0 if x < 0.

Let

Y1 = X1 + X2
Y2 = X1 − X2

(a) Find the joint pdf of Y1 and Y2.

(b) Find the marginal pdf of Y1. What is the distribution of Y1?

(c) Find the conditional pdf of Y2, given Y1 = y1, for some fixed y1.

6. (16 points) Joe and Chris go to the Blue Wall every day and flip a coin to decide who will buy coffee for the other at a price of $1. Joe says they needn't keep track of the history since things will "average out" and not matter in the long run. Let T_n be Joe's "debt" to Chris after n trips to the BW (T_n could be negative). For all of the following questions you must justify your answers. Let c > 0.

(a) What is lim_{n→∞} P(|T_n/n| < c)?

(b) Give a function s(n) so that α = lim_{n→∞} P(|T_n/s(n)| < c) satisfies 0 < α < 1. What is α? (Your answer needn't be a number, but should be computable from a table.)

(c) What is lim_{n→∞} P(|T_n| < c)?

(d) Do you agree with Joe’s assertion?
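A simulation sketch (Python) of Joe's debt that bears on (a) through (d); the choice s(n) = √n tried in the last line is only one natural candidate, and n, the replication count, and c = 1 are arbitrary.

    import numpy as np

    rng = np.random.default_rng(7)
    n, reps, c = 10_000, 100_000, 1.0
    Tn = 2 * rng.binomial(n, 0.5, size=reps) - n     # debt after n fair +-$1 coin flips
    print(np.mean(np.abs(Tn / n) < c))               # (a): fraction with |T_n / n| < c
    print(np.mean(np.abs(Tn / np.sqrt(n)) < c))      # (b): one candidate scaling s(n) = sqrt(n)
    print(np.mean(np.abs(Tn) < c))                   # (c): fraction with |T_n| < c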


DEPARTMENT OF MATHEMATICS AND STATISTICS
UNIVERSITY OF MASSACHUSETTS
BASIC EXAM – PROBABILITY
January 22, 2001

Work all problems. 60 points are needed to pass at the Master’s level and 75 to pass at the Ph.D.

level.

1. (20 pts) Two cards are drawn without replacement from a standard deck.

(a) Assuming that the order in which the cards are drawn is important, describe an appropriate sample space for this situation.

(b) Let A1 be the event "first card is an ace," and A2 be the event "second card is an ace." Use the definition of conditional probability to compute P(A2|A1).

(c) Consider now an arbitrary sample space Ω and two events C and B in Ω, with P(C) > 0. Suppose that C1, C2, . . . , Ck are disjoint sets with P(Ci) > 0 and C = ∪_{i=1}^k Ci. Further suppose that P(B|Ci) = P(B|C1) for i = 2, . . . , k. Show that P(B|C) = P(B|C1).

(d) Returning to the card experiment, let B1 be the event "first card is the ace of spades." Compute P(A2|B1). How would the answer change if the first card were the ace of diamonds?

2. (20 pts) Consider the following problems involving independent flips of fair coins.

(a) Flip a coin until the last two flips are HT and let N be the number of flips required. What is the pdf of N?

(b) Begin by flipping k coins simultaneously. After the first simultaneous flip remove all the coins which showed heads and flip the remaining coins simultaneously. Continue this process until no coins are left and let T be the number of the last simultaneous flip. Compute the cdf and pdf of T.
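For intuition on part (b), a minimal Python sketch that simulates T and tabulates its empirical cdf; k = 5 coins, the replication count, and the seed are arbitrary choices.

    import numpy as np

    rng = np.random.default_rng(8)
    k, reps = 5, 200_000
    # Each coin is removed the first time it shows heads, so its removal time is
    # geometric with success probability 1/2; T is the largest of the k removal times.
    times = rng.geometric(0.5, size=(reps, k))   # flip number on which each coin first shows heads
    T = times.max(axis=1)
    for t in range(1, 8):
        print(t, np.mean(T <= t))                # empirical cdf of T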

3. (20 pts) Let X1, X2, and X3 be independent random variables such that X_i has a Gamma distribution with parameters α_i, β_i. That is, X_i has density

f_i(x) = 1_{(0,∞)}(x) (β_i^{α_i} / Γ(α_i)) x^{α_i − 1} e^(−β_i x)

where α_i > 0 and β_i > 0. Let

Y1 = X1 / (X1 + X2 + X3)
Y2 = X2 / (X1 + X2 + X3)
Y3 = X1 + X2 + X3

(a) Find the joint density of Y1, Y2, Y3.

(b) Find the marginal joint density of Y1, Y2.

(c) Show that (Y1, Y2) and Y3 are independent.

4. (15 pts) From an urn containing 10 balls numbered 0 through 9, n balls are drawn with replacement. Let X_i = 1 if the ith draw yields the ball numbered 0, and X_i = 0 otherwise, for i = 1, . . . , n.

(a) What does the weak law of large numbers tell you about the occurrence of 0’s in the n

drawings?


(b) Use the central limit theorem to find an approximate probability that, among the n balls thus chosen, the ball numbered 0 will appear between (n − 3√n)/10 and (n + 3√n)/10 times if n = 100.
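A quick Python check of part (b) for n = 100; the exact binomial sum is included only for comparison with the CLT approximation.

    from math import comb, erf, sqrt

    n, p = 100, 0.1
    lo, hi = (n - 3 * sqrt(n)) / 10, (n + 3 * sqrt(n)) / 10     # 7 and 13 when n = 100

    def Phi(x):                                                 # standard normal cdf
        return 0.5 * (1 + erf(x / sqrt(2)))

    mean, sd = n * p, sqrt(n * p * (1 - p))
    print(Phi((hi - mean) / sd) - Phi((lo - mean) / sd))        # CLT approximation
    print(sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(7, 14)))  # exact P(7 <= count <= 13)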

5. (25 pts)

(a) A random variable has a χ²(ν) distribution if its pdf is

f(x) = (1 / (2^(ν/2) Γ(ν/2))) x^(ν/2 − 1) e^(−x/2)  for x > 0, and 0 otherwise,

with MGF

M_X(t) = (1 / (1 − 2t))^(ν/2).

Let X1, X2, . . . be independent with X_i ~ χ²(ν_i). What is the pdf of Y = Σ_{i=1}^n X_i?

(b) A random variable X has moment generating function

M_X(t) = (1/2)e^t + (1/4)e^(2t) + (1/4)e^(5t).

What is the distribution of X? Why?
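Finally, a small sanity-check sketch in Python for part (a); the degrees of freedom (2, 3, 4), the replication count, and the seed are arbitrary, and the printed moments can be compared with the mean and variance of a chi-squared distribution with Σν_i degrees of freedom (ν and 2ν for a χ²(ν) variable).

    import numpy as np

    rng = np.random.default_rng(9)
    nus, reps = [2, 3, 4], 500_000                      # arbitrary degrees of freedom
    Y = sum(rng.chisquare(nu, reps) for nu in nus)      # Y = X1 + X2 + X3
    print(Y.mean(), Y.var())                            # compare with sum(nus) and 2 * sum(nus)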