
Page 1: STA301_LEC25

Virtual University of Pakistan

Lecture No. 25 of the course on

Statistics and Probability

by

Miss Saleha Naghmi Habibullah

Page 2: STA301_LEC25

IN THE LAST LECTURE, YOU LEARNT

• Chebychev’s Inequality
• Concept of Continuous Probability Distribution
• Distribution Function of a Continuous Probability Distribution

Page 3: STA301_LEC25

TOPICS FOR TODAY

• Mathematical Expectation, Variance & Moments of a Continuous Probability Distribution

• BIVARIATE Probability Distribution

Page 4: STA301_LEC25

You will recall that, in the last lecture, we were dealing with an example of a continuous probability distribution in which we were interested in computing a conditional probability.

We now discuss this particular concept:

Page 5: STA301_LEC25

EXAMPLE

a) Find the value of k so that the function f(x) defined as follows, may be a density function

f(x) = kx, 0 < x < 2
     = 0, elsewhere

b) Compute P(X = 1).

c) Compute P(X > 1).

d) Compute the distribution function F(x).

e) Compute P(X ≤ 1/2 | 1/3 ≤ X ≤ 2/3).

Page 6: STA301_LEC25

SOLUTION

We had
f(x) = kx, 0 < x < 2
     = 0, elsewhere
and we obtained k = 1/2. Hence:

f(x) = x/2, for 0 < x < 2
     = 0, elsewhere.

Page 7: STA301_LEC25

e) Applying the definition of conditional probability, we get

P(X ≤ 1/2 | 1/3 ≤ X ≤ 2/3)
= P(1/3 ≤ X ≤ 1/2) / P(1/3 ≤ X ≤ 2/3)
= [∫ (x/2) dx, taken from 1/3 to 1/2] / [∫ (x/2) dx, taken from 1/3 to 2/3]
= [x²/4 evaluated from 1/3 to 1/2] / [x²/4 evaluated from 1/3 to 2/3]
= [(1/2)² − (1/3)²] / [(2/3)² − (1/3)²]   (the common factor 1/4 cancels)
= (1/4 − 1/9) / (4/9 − 1/9)
= (5/36) / (1/3)
= 5/12.
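The following short Python sketch (not part of the original lecture) checks this value numerically; it assumes the density f(x) = x/2 on (0, 2) obtained above.

from scipy.integrate import quad

f = lambda x: x / 2                 # density with k = 1/2

num = quad(f, 1/3, 1/2)[0]          # P(1/3 <= X <= 1/2)
den = quad(f, 1/3, 2/3)[0]          # P(1/3 <= X <= 2/3)
print(num / den)                    # 0.41666... = 5/12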

Page 8: STA301_LEC25

The above example was of the simplest case when the graph of our continuous probability distribution is in the form of a straight line.

Let us now consider a slightly more complicated situation:

Page 9: STA301_LEC25

EXAMPLE

A continuous random variable X has the d.f. F(x) as follows:

F(x) = 0, for x < 0,
     = 2x²/5, for 0 ≤ x < 1,
     = (2/5)(3x − x²/2) − 3/5, for 1 ≤ x < 2,
     = 1, for x ≥ 2.

Find the p.d.f. and P(|X| < 1.5).

Page 10: STA301_LEC25

SOLUTION

By definition, we have f(x) = dF(x)/dx.

Therefore f(x) = 4x/5, for 0 < x < 1
              = (2/5)(3 − x), for 1 < x < 2
              = 0, elsewhere.
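As a side illustration (not from the lecture), the differentiation step can be checked symbolically; the pieces of F(x) below are the ones given in the example.

import sympy as sp

x = sp.symbols('x')
F1 = 2 * x**2 / 5                                              # F(x) on 0 <= x < 1
F2 = sp.Rational(2, 5) * (3*x - x**2 / 2) - sp.Rational(3, 5)  # F(x) on 1 <= x < 2

print(sp.diff(F1, x))               # 4*x/5
print(sp.expand(sp.diff(F2, x)))    # 6/5 - 2*x/5, i.e. (2/5)(3 - x)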

Page 11: STA301_LEC25

Now P(|X| < 1.5) = P(–1.5 < X < 1.5)

= ∫ 0 dx (from –1.5 to 0) + ∫ (4x/5) dx (from 0 to 1) + ∫ (2/5)(3 − x) dx (from 1 to 1.5)

= [2x²/5] evaluated from 0 to 1 + (2/5)[3x − x²/2] evaluated from 1 to 1.5

= 2/5 + (2/5)[(4.5 − 2.25/2) − (3 − 1/2)]

= 0.40 + 0.35 = 0.75.
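A quick numerical cross-check of this probability (an illustrative sketch, not part of the lecture), using the two non-zero pieces of the density:

from scipy.integrate import quad

f1 = lambda x: 4 * x / 5            # f(x) on 0 < x < 1
f2 = lambda x: (2 / 5) * (3 - x)    # f(x) on 1 < x < 2

# f(x) = 0 below 0, so only the pieces on (0, 1) and (1, 1.5) contribute
p = quad(f1, 0, 1)[0] + quad(f2, 1, 1.5)[0]
print(p)                            # 0.75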

Page 12: STA301_LEC25

Let us now discuss the mathematical expectation of continuous random variables through the following example:

Page 13: STA301_LEC25

EXAMPLE

Find the expected value of the random variable X having the p.d.f.

f(x) = 2(1 – x), 0 < x < 1
     = 0, elsewhere.

Page 14: STA301_LEC25

SOLUTION

Now

E(X) = ∫ x f(x) dx = ∫ x · 2(1 − x) dx, taken from 0 to 1
     = 2 [x²/2 − x³/3] evaluated from 0 to 1
     = 2 (1/2 − 1/3)
     = 1/3.
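A one-line numerical check of this expected value (illustrative only, assuming the density above):

from scipy.integrate import quad

print(quad(lambda x: x * 2 * (1 - x), 0, 1)[0])   # 0.3333... = 1/3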

Page 15: STA301_LEC25

As indicated earlier, the term ‘expected value’ implies the mean value.

The graph of the above probability density function and its mean value are presented in the following figure:

Page 16: STA301_LEC25

[Figure: graph of f(x) = 2(1 − x) for 0 < x < 1, a straight line falling from f(0) = 2 to f(1) = 0, with the mean E(X) = 0.33 marked on the X-axis.]

Page 17: STA301_LEC25

Suppose that we are interested in verifying the properties of mathematical expectation that are valid in the case of univariate probability distributions.

Page 18: STA301_LEC25

You will recall that, in the last lecture, we noted that if X is a discrete random variable and if a and b are constants, then

E(aX + b) = a E(X) + b.

This property is equally valid in the case of continuous probability distributions.

Page 19: STA301_LEC25

In this example, suppose that a = 3 and b = 5. Then, we wish to verify that

E(3X + 5) = 3 E(X) + 5.

The right-hand-side of the above equation is:

3 E(X) + 5 = 3(1/3) + 5 = 1 + 5 = 6.

Page 20: STA301_LEC25

In order to compute the left-hand-side, we proceed as follows:

E(3X + 5) = ∫ (3x + 5) · 2(1 − x) dx, taken from 0 to 1
          = 2 ∫ (5 − 2x − 3x²) dx, taken from 0 to 1
          = 2 [5x − x² − x³] evaluated from 0 to 1
          = 2 (5 − 1 − 1) = 2(3) = 6.
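The same verification can be done symbolically; the sketch below (not from the lecture) computes both sides exactly with sympy.

import sympy as sp

x = sp.symbols('x')
f = 2 * (1 - x)                                   # density on (0, 1)

EX  = sp.integrate(x * f, (x, 0, 1))              # E(X) = 1/3
lhs = sp.integrate((3*x + 5) * f, (x, 0, 1))      # E(3X + 5)
print(lhs, 3 * EX + 5)                            # 6 6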

Page 21: STA301_LEC25

Since the left-hand-side is equal to the right-hand-side, therefore the property is verified.

Page 22: STA301_LEC25

SPECIAL CASE:

We have

E(aX + b) = a E(X) + b.

If b = 0, the above property takes the following simple form:

E(aX) = a E(X).

Page 23: STA301_LEC25

Next, let us consider the computation of the moments and moment-ratios in the case of a continuous probability distribution:

Page 24: STA301_LEC25

EXAMPLEA continuous random variable X has the p.d.f.

f(x) = (3/4) x(2 − x), 0 ≤ x ≤ 2,
     = 0, otherwise.

Find the first four moments about the mean and the moment-ratios.

Page 25: STA301_LEC25

We first calculate the moments about origin as:

Page 26: STA301_LEC25

μ1′ = E(X) = ∫ x f(x) dx
    = (3/4) ∫ x²(2 − x) dx, taken from 0 to 2
    = (3/4) [2x³/3 − x⁴/4] evaluated from 0 to 2
    = (3/4) (16/3 − 16/4)
    = (3/4) (16/12) = 1;

Page 27: STA301_LEC25

μ2′ = E(X²) = ∫ x² f(x) dx
    = (3/4) ∫ x³(2 − x) dx, taken from 0 to 2
    = (3/4) [2x⁴/4 − x⁵/5] evaluated from 0 to 2
    = (3/4) (8 − 32/5)
    = (3/4) (8/5) = 6/5;

Page 28: STA301_LEC25

μ3′ = E(X³) = ∫ x³ f(x) dx
    = (3/4) ∫ x⁴(2 − x) dx, taken from 0 to 2
    = (3/4) [2x⁵/5 − x⁶/6] evaluated from 0 to 2
    = (3/4) (64/5 − 64/6)
    = (3/4) (64/30) = 8/5;

Page 29: STA301_LEC25

μ4′ = E(X⁴) = ∫ x⁴ f(x) dx
    = (3/4) ∫ x⁵(2 − x) dx, taken from 0 to 2
    = (3/4) [2x⁶/6 − x⁷/7] evaluated from 0 to 2
    = (3/4) (64/3 − 128/7)
    = (3/4) (64/21) = 16/7.

Page 30: STA301_LEC25

Next, we find the moments about the mean as follows:

μ1 = 0

μ2 = μ2′ − (μ1′)² = 6/5 − (1)² = 1/5;

Page 31: STA301_LEC25

μ3 = μ3′ − 3μ2′μ1′ + 2(μ1′)³
   = 8/5 − 3(6/5)(1) + 2(1)³
   = (8 − 18 + 10)/5 = 0;

Page 32: STA301_LEC25

μ4 = μ4′ − 4μ3′μ1′ + 6μ2′(μ1′)² − 3(μ1′)⁴
   = 16/7 − 4(8/5)(1) + 6(6/5)(1)² − 3(1)⁴
   = 16/7 − 32/5 + 36/5 − 3
   = 3/35.

Page 33: STA301_LEC25

The first moment-ratio is

β1 = μ3² / μ2³ = (0)² / (1/5)³ = 0.

This implies that this particular continuous probability distribution is absolutely symmetric.

Page 34: STA301_LEC25

The second moment-ratio is

β2 = μ4 / μ2² = (3/35) / (1/5)² = 15/7 = 2.14.

This implies that this particular continuous probability distribution may be regarded as platykurtic, i.e. flatter than the normal distribution.
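All four moments and both moment-ratios can be reproduced with a few lines of exact symbolic computation; this sketch is illustrative and not part of the lecture.

import sympy as sp

x = sp.symbols('x')
f = sp.Rational(3, 4) * x * (2 - x)               # the p.d.f. on (0, 2)

m = [sp.integrate(x**r * f, (x, 0, 2)) for r in range(1, 5)]   # mu'_1 .. mu'_4
mu2 = m[1] - m[0]**2
mu3 = m[2] - 3*m[1]*m[0] + 2*m[0]**3
mu4 = m[3] - 4*m[2]*m[0] + 6*m[1]*m[0]**2 - 3*m[0]**4
print(m)                              # [1, 6/5, 8/5, 16/7]
print(mu2, mu3, mu4)                  # 1/5 0 3/35
print(mu3**2 / mu2**3, mu4 / mu2**2)  # beta1 = 0, beta2 = 15/7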

Page 35: STA301_LEC25

The students are encouraged to draw the graph of this distribution in order to develop a visual picture in their minds.

Page 36: STA301_LEC25

We begin the concept of bivariate probability distribution by introducing the term ‘Joint Distributions’:

Page 37: STA301_LEC25

JOINT DISTRIBUTIONS:

The distribution of two or more random variables which are observed simultaneously when an experiment is performed, is called their JOINT distribution.

It is customary to call the distribution of a single random variable univariate.

Likewise, a distribution involving two, three or many r.v.’s simultaneously is referred to as bivariate, trivariate or multivariate.

Page 38: STA301_LEC25

A bivariate distribution may be discrete when the possible values of (X, Y) are finite or countably infinite. It is continuous if (X, Y) can assume all values in some non-countable set of the plane. A bivariate distribution is said to be mixed when one r.v. is discrete and the other is continuous.

Page 39: STA301_LEC25

Bivariate Probability Function:

Let X and Y be two discrete r.v.’s defined on the same sample space S, X taking the values x1, x2, …, xm

and Y taking the values y1, y2, …, yn.

Page 40: STA301_LEC25

Then the probability that X takes on the value xi and, at the same time, Y takes on the value yj, denoted by f(xi, yj) or pij, is defined to be the joint probability function or simply the joint distribution of X and Y.

Page 41: STA301_LEC25

Thus the joint probability function, also called the bivariate probability function f(x, y) is a function whose value at the point (xi, yj) is given by

f(xi, yj) = P(X = xi and Y = yj),

i = 1, 2, …, m; j = 1, 2, …, n.

Page 42: STA301_LEC25

The joint or bivariate probability distribution consisting of all pairs of values (xi, yj) and their associated probabilities f(xi, yj) i.e. the set of triples [xi, yj, f(xi, yj)] can either be shown in the following two-way table:

Page 43: STA301_LEC25

Joint Probability Distribution of X and Y

X\Y        y1          y2          …   yj          …   yn          P(X = xi) = g(x)
x1         f(x1, y1)   f(x1, y2)   …   f(x1, yj)   …   f(x1, yn)   g(x1)
x2         f(x2, y1)   f(x2, y2)   …   f(x2, yj)   …   f(x2, yn)   g(x2)
…
xi         f(xi, y1)   f(xi, y2)   …   f(xi, yj)   …   f(xi, yn)   g(xi)
…
xm         f(xm, y1)   f(xm, y2)   …   f(xm, yj)   …   f(xm, yn)   g(xm)
P(Y = yj)
= h(y)     h(y1)       h(y2)       …   h(yj)       …   h(yn)       1

Page 44: STA301_LEC25

or be expressed by means of a formula for f(x, y). The probabilities f(x, y) can be obtained by substituting appropriate values of x and y in the table or formula.

Page 45: STA301_LEC25

A joint probability function has the following properties:

Page 46: STA301_LEC25

PROPERTIES:

i) f(xi, yj) ≥ 0, for all (xi, yj), i.e. for i = 1, 2, …, m; j = 1, 2, …, n.

ii) Σi Σj f(xi, yj) = 1.

Page 47: STA301_LEC25

Next, we consider the concept of MARGINAL PROBABILITY FUNCTIONS:

Page 48: STA301_LEC25

The point to be understood here is that, from the joint probability function for (X, Y), we can obtain the INDIVIDUAL probability function of X and Y. Such individual probability functions are called MARGINAL probability functions.

Page 49: STA301_LEC25

Let f(x, y) be the joint probability function of two discrete r.v.’s X and Y. Then the marginal probability function of X is defined as

g(xi) = Σ f(xi, yj), summed over j = 1 to n,
      = f(xi, y1) + f(xi, y2) + … + f(xi, yn)
      = P(X = xi),

as xi must occur either with y1 or y2 or … or yn.

Page 50: STA301_LEC25

that is, the individual probability function of X is found by adding over the rows of the two-way table.

Page 51: STA301_LEC25

Similarly, the marginal probability function for Y is obtained by adding over the columns as

h(yj) = Σ f(xi, yj), summed over i = 1 to m, = P(Y = yj).

Page 52: STA301_LEC25

The values of the marginal probabilities are often written in the margins of the joint table as they are the row and column totals in the table. The probabilities in each marginal probability function add to 1.
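As a small illustration (the numbers below are made up, not from the lecture), marginal probabilities are just the row and column totals of the joint table:

from fractions import Fraction as F

# a toy 2 x 3 joint table: joint[(x, y)] = f(x, y)
joint = {(0, 0): F(1, 12), (0, 1): F(2, 12), (0, 2): F(3, 12),
         (1, 0): F(2, 12), (1, 1): F(3, 12), (1, 2): F(1, 12)}

g = {x: sum(p for (xi, _), p in joint.items() if xi == x) for x in (0, 1)}
h = {y: sum(p for (_, yj), p in joint.items() if yj == y) for y in (0, 1, 2)}
print(g)                   # {0: Fraction(1, 2), 1: Fraction(1, 2)}
print(sum(h.values()))     # 1 -- each marginal adds to 1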

Page 53: STA301_LEC25

Next, we consider the concept of CONDITIONAL PROBABILITY FUNCTION:

Page 54: STA301_LEC25

Let X and Y be two discrete r.v.’s with joint probability function f(x, y).

Then the conditional probability function for X given Y = y, denoted as f(x|y), is defined by

Page 55: STA301_LEC25

f(xi | yj) = P(X = xi | Y = yj)

           = P(X = xi and Y = yj) / P(Y = yj)

           = f(xi, yj) / h(yj),

for i = 1, 2, …; j = 1, 2, …, where h(yj) is the marginal probability of Y and h(yj) > 0.

Page 56: STA301_LEC25

It gives the probability that X takes on the value xi given that Y has taken on the value yj.

The conditional probability f(xi | yj) is non-negative and (for a given fixed yj) adds to 1 on i and hence is a probability function.

Page 57: STA301_LEC25

Similarly, the conditional probability function for Y given X = x is

f(yj | xi) = P(Y = yj | X = xi)

           = P(Y = yj and X = xi) / P(X = xi)

           = f(xi, yj) / g(xi), where g(xi) > 0.

Page 58: STA301_LEC25

INDEPENDENCE:

Two discrete r.v.’s X and Y are said to be statistically independent, if and only if, for all possible pairs of values (xi, yj) the joint probability function f(x, y) can be expressed as the product of the two marginal probability functions.

Page 59: STA301_LEC25

That is, X and Y are independent if, for all i and j,

f(x, y) = P(X = xi and Y = yj)
        = P(X = xi) · P(Y = yj)
        = g(x) h(y).

Page 60: STA301_LEC25

It should be noted that the joint probability function of X and Y when they are independent, can be obtained by MULTIPLYING together their marginal probability functions.
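A minimal sketch of this multiplication check (with made-up probabilities, purely for illustration):

from fractions import Fraction as F

# a small joint table that happens to factor into its marginals
joint = {(0, 0): F(1, 8), (0, 1): F(1, 8),
         (1, 0): F(3, 8), (1, 1): F(3, 8)}

g = {0: F(1, 4), 1: F(3, 4)}     # row totals (marginal of X)
h = {0: F(1, 2), 1: F(1, 2)}     # column totals (marginal of Y)

independent = all(joint[x, y] == g[x] * h[y] for x in g for y in h)
print(independent)               # True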

Page 61: STA301_LEC25

Let us now illustrate all these concepts with the help of an example:

Page 62: STA301_LEC25

EXAMPLE:

An urn contains 3 black, 2 red and 3 green balls and 2 balls are selected at random from it. If X is the number of black balls and Y is the number of red balls selected, then find

Page 63: STA301_LEC25

i) the joint probability function f(x, y);
ii) P(X + Y < 1);
iii) the marginal p.d.’s g(x) and h(y);
iv) the conditional p.d. f(x | 1);
v) P(X = 0 | Y = 1); and
vi) Are X and Y independent?

Page 64: STA301_LEC25

i) The sample space S for this experiment contains 28 sample points. The possible values of X are 0, 1, and 2, and those for Y are 0, 1, and 2.

The values that (X, Y) can take on are (0, 0), (0, 1), (1, 0), (1, 1), (0, 2) and (2, 0). We desire to find f(x, y) for each value (x, y).

Page 65: STA301_LEC25

The total number of ways in which 2 balls can be drawn out of a total of 8 balls is

C(8, 2) = (8 × 7)/2 = 28.

Page 66: STA301_LEC25

Now f(0, 0) = P(X = 0 and Y = 0), where the event (X = 0 and Y = 0) represents that neither a black nor a red ball is selected, implying that the 2 selected balls are green. This event therefore contains

C(3, 0) × C(2, 0) × C(3, 2) = 1 × 1 × 3 = 3

sample points, and

Page 67: STA301_LEC25

f(0, 0) = P(X = 0 and Y = 0) = 3/28

Again f(0, 1) = P(X = 0 and Y = 1)

= P (none is black, 1 is red and 1 is green)

= [C(3, 0) × C(2, 1) × C(3, 1)] / 28 = 6/28.

Page 68: STA301_LEC25

Similarly, f(1, 1) = P(X = 1 and Y = 1)
= P(1 is black, 1 is red and none is green)
= [C(3, 1) × C(2, 1) × C(3, 0)] / 28 = 6/28.

Page 69: STA301_LEC25

Similar calculations give the probabilities of other values and the joint probability function of X and Y is given as:

Page 70: STA301_LEC25

Joint Probability Distribution of X and Y

X\Y                 0       1       2       P(X = xi) = g(x)
0                   3/28    6/28    1/28    10/28
1                   9/28    6/28    0       15/28
2                   3/28    0       0       3/28
P(Y = yj) = h(y)    15/28   12/28   1/28    1
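The whole table, the marginals and most of the remaining parts of the question can be reproduced with elementary combinatorics; the sketch below is illustrative (the helper names are my own) and, among other things, shows that X and Y are not independent.

from math import comb
from fractions import Fraction as F

def f(x, y):
    # joint probability of x black and y red balls among the 2 drawn
    greens = 2 - x - y
    if x < 0 or y < 0 or greens < 0:
        return F(0)
    return F(comb(3, x) * comb(2, y) * comb(3, greens), comb(8, 2))

g = {x: sum(f(x, y) for y in range(3)) for x in range(3)}    # marginal of X
h = {y: sum(f(x, y) for x in range(3)) for y in range(3)}    # marginal of Y

print(f(0, 0), f(1, 1))        # 3/28 and 6/28 (printed in lowest terms as 3/14)
print(f(0, 1) / h[1])          # v) P(X = 0 | Y = 1) = 1/2
print({x: f(x, 1) / h[1] for x in range(3)})     # iv) conditional p.d. f(x | 1)
print(all(f(x, y) == g[x] * h[y]
          for x in range(3) for y in range(3)))  # vi) False -- not independent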

Page 71: STA301_LEC25

IN TODAY’S LECTURE, YOU LEARNT

• Mathematical Expectation, Variance & Moments of Continuous Probability Distributions

•BIVARIATE Probability Distribution (Discrete case)

Page 72: STA301_LEC25

IN THE NEXT LECTURE, YOU WILL LEARN

•BIVARIATE Probability Distributions (Discrete and Continuous)

• Properties of Expected Values in the case of Bivariate Probability Distributions

• Covariance & Correlation