STATISTICAL INFERENCE PART II: SOME PROPERTIES OF ESTIMATORS


SOME PROPERTIES OF ESTIMATORS

• θ: a parameter of interest; unknown

• Previously, we found good(?) estimator(s) for θ or its function g(θ).

• Goal:

• Check how good these estimators are. Are they good at all?

• If more than one good estimator is available, which one is better?


SOME PROPERTIES OF ESTIMATORS

• UNBIASED ESTIMATOR (UE): An estimator $\hat{\theta}$ is a UE of the unknown parameter $\theta$ if

$$E(\hat{\theta}) = \theta \quad \text{for all } \theta.$$

Otherwise, it is a biased estimator of $\theta$.

$$\text{Bias}(\hat{\theta}) = E(\hat{\theta}) - \theta$$

is the bias of $\hat{\theta}$ for estimating $\theta$.

If $\hat{\theta}$ is a UE of $\theta$, then $\text{Bias}(\hat{\theta}) = 0$.
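A quick Monte Carlo check of bias (an illustrative sketch, not from the slides; the distribution, sample size, and seed are arbitrary choices) comparing the variance estimator with divisor n against the one with divisor n − 1:

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps, sigma2 = 10, 100_000, 4.0

# Draw `reps` independent samples of size n from N(0, sigma2).
samples = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))

# ddof=0: divisor n (biased); ddof=1: divisor n-1 (unbiased).
var_biased = samples.var(axis=1, ddof=0)
var_unbiased = samples.var(axis=1, ddof=1)

# Estimated bias = average estimate minus the true sigma2.
print("bias with divisor n:  ", var_biased.mean() - sigma2)    # close to -sigma2/n = -0.4
print("bias with divisor n-1:", var_unbiased.mean() - sigma2)  # close to 0
```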


SOME PROPERTIES OF ESTIMATORS

• ASYMPTOTICALLY UNBIASED ESTIMATOR (AUE): An estimator $\hat{\theta}$ is an AUE of the unknown parameter $\theta$ if

$$\text{Bias}(\hat{\theta}) \neq 0 \quad \text{but} \quad \lim_{n \to \infty} \text{Bias}(\hat{\theta}) = 0.$$
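A standard example (not worked on the slides) is the variance estimator with divisor n: since

$$E\!\left[\frac{1}{n}\sum_{i=1}^{n}(X_i - \bar{X})^2\right] = \frac{n-1}{n}\sigma^2, \qquad \text{Bias} = -\frac{\sigma^2}{n} \to 0 \ \text{as } n \to \infty,$$

this estimator is biased for every finite n but asymptotically unbiased.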


SOME PROPERTIES OF ESTIMATORS

• CONSISTENT ESTIMATOR (CE): An estimator $\hat{\theta}$ which converges in probability to the unknown parameter $\theta$ for all $\theta$ is called a CE of $\theta$:

$$\hat{\theta} \xrightarrow{p} \theta.$$

• MLEs are generally CEs.

For large n, a CE tends to be closer to the unknown population parameter.


EXAMPLE

For a r.s. of size n,

$$E(\bar{X}) = \mu \;\Rightarrow\; \bar{X} \text{ is a UE of } \mu.$$

By the WLLN,

$$\bar{X} \xrightarrow{p} \mu,$$

so $\bar{X}$ is a CE of μ.
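The convergence in probability can be seen in a simulation (an illustrative sketch, not from the slides; the exponential distribution and seed are arbitrary choices): the running sample mean settles near μ as n grows.

```python
import numpy as np

rng = np.random.default_rng(1)
mu = 5.0  # true mean

# One long sample; compute the running mean at each n.
x = rng.exponential(mu, size=100_000)
running_mean = np.cumsum(x) / np.arange(1, x.size + 1)

for n in (10, 100, 1_000, 100_000):
    print(f"n = {n:>6}: x_bar = {running_mean[n - 1]:.4f}")  # approaches mu = 5
```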


MEAN SQUARED ERROR (MSE)

• The Mean Squared Error (MSE) of an estimator $\hat{\theta}$ for estimating $\theta$ is

$$\text{MSE}(\hat{\theta}) = E(\hat{\theta} - \theta)^2 = \text{Var}(\hat{\theta}) + \left[\text{Bias}(\hat{\theta})\right]^2.$$

If $\text{MSE}(\hat{\theta})$ is smaller, $\hat{\theta}$ is a better estimator of θ.

For two estimators $\hat{\theta}_1$ and $\hat{\theta}_2$ of θ, if

$$\text{MSE}(\hat{\theta}_1) < \text{MSE}(\hat{\theta}_2),$$

then $\hat{\theta}_1$ is the better estimator of θ.
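The decomposition follows by adding and subtracting $E(\hat{\theta})$ inside the square:

$$E(\hat{\theta} - \theta)^2 = E\left[(\hat{\theta} - E(\hat{\theta})) + (E(\hat{\theta}) - \theta)\right]^2 = \text{Var}(\hat{\theta}) + \left[\text{Bias}(\hat{\theta})\right]^2,$$

since the cross term $2\,E[\hat{\theta} - E(\hat{\theta})]\,(E(\hat{\theta}) - \theta)$ vanishes.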


MEAN SQUARED ERROR CONSISTENCY

$\hat{\theta}$ is called mean squared error consistent (or consistent in quadratic mean) if

$$E(\hat{\theta} - \theta)^2 \to 0 \quad \text{as } n \to \infty.$$

Theorem: $\hat{\theta}$ is consistent in MSE iff

i) $\text{Var}(\hat{\theta}) \to 0$ as $n \to \infty$, and

ii) $\lim_{n \to \infty} E(\hat{\theta}) = \theta$.

If $E(\hat{\theta} - \theta)^2 \to 0$ as $n \to \infty$, then $\hat{\theta}$ is also a CE of θ.
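For example (a direct check, not worked on the slides), for $\bar{X}$ from a r.s. with mean μ and variance σ²,

$$E(\bar{X}) = \mu \quad \text{and} \quad \text{Var}(\bar{X}) = \frac{\sigma^2}{n} \to 0 \ \text{as } n \to \infty,$$

so $\bar{X}$ is MSE consistent, and hence also a CE of μ.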


EXAMPLES

X ~ Exp(θ), θ > 0. For a r.s. of size n, consider the following estimators of θ, and discuss their bias and consistency:

$$\hat{\theta}_1 = \frac{\sum_{i=1}^{n} X_i}{n}, \qquad \hat{\theta}_2 = \frac{\sum_{i=1}^{n} X_i}{n+1}.$$
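A simulation sketch of these two estimators (not from the slides; it assumes the mean parameterization E(X) = θ, consistent with $\hat{\theta}_1 = \bar{X}$, and uses arbitrary θ, sample sizes, and seed):

```python
import numpy as np

rng = np.random.default_rng(2)
theta, reps = 3.0, 50_000

for n in (5, 50, 500):
    x = rng.exponential(theta, size=(reps, n))   # E(X) = theta
    t1 = x.sum(axis=1) / n                       # theta_hat_1: unbiased
    t2 = x.sum(axis=1) / (n + 1)                 # theta_hat_2: biased, asymptotically unbiased
    print(f"n={n:>3}  bias1={t1.mean() - theta:+.4f}  bias2={t2.mean() - theta:+.4f}  "
          f"mse1={((t1 - theta)**2).mean():.4f}  mse2={((t2 - theta)**2).mean():.4f}")
```

Both MSEs shrink toward 0, so both estimators are consistent, but only $\hat{\theta}_1$ is unbiased for every n.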


SUFFICIENT STATISTICS

• X has pdf or pmf f(x; θ).

• X1, X2, …, Xn is a random sample.

• Y = U(X1, X2, …, Xn) is a statistic.

• A sufficient statistic, Y, is a statistic which contains all the information in the sample for the estimation of θ.


SUFFICIENT STATISTICS

• Given the value of Y, the sample contains no further information for the estimation of θ.

• Y is a sufficient statistic (ss) for θ if the conditional distribution h(x1, x2, …, xn | y) does not depend on θ for every given Y = y.

• A ss for θ is not unique:

• If Y is a ss for θ, then any one-to-one transformation of Y, say Y1 = fn(Y), is also a ss for θ.


SUFFICIENT STATISTICS

• The conditional distribution of the sample rvs given the value y of Y is defined as

$$h(x_1, x_2, \ldots, x_n \mid y) = \frac{f(x_1, x_2, \ldots, x_n, y; \theta)}{g(y; \theta)} = \frac{L(x_1, x_2, \ldots, x_n; \theta)}{g(y; \theta)}.$$

• If Y is a ss for θ, then

$$h(x_1, x_2, \ldots, x_n \mid y) = \frac{L(x_1, x_2, \ldots, x_n; \theta)}{g(y; \theta)} = H(x_1, x_2, \ldots, x_n),$$

which does not depend on θ for any given y. (H may involve y or constants, but not θ.)

• Also, the conditional range of the Xi given y must not depend on θ.


SUFFICIENT STATISTICS

EXAMPLE: X ~ Ber(p). For a r.s. of size n, show that $Y = \sum_{i=1}^{n} X_i$ is a ss for p.
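A sketch of the standard computation: for any $x_1, \ldots, x_n$ with $\sum x_i = y$,

$$P(X_1 = x_1, \ldots, X_n = x_n \mid Y = y) = \frac{p^{y}(1-p)^{n-y}}{\binom{n}{y} p^{y}(1-p)^{n-y}} = \frac{1}{\binom{n}{y}},$$

which does not depend on p, so $Y = \sum_{i=1}^{n} X_i$ is a ss for p.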


SUFFICIENT STATISTICS

• Neyman’s Factorization Theorem: Y is a ss for θ iff

$$L(x_1, x_2, \ldots, x_n; \theta) = k_1(y; \theta)\, k_2(x_1, x_2, \ldots, x_n),$$

where k1 and k2 are non-negative functions: the likelihood’s dependence on the xi enters k1 only through y, and k2 does not depend on θ (nor may the range of the xi).
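As an illustration (not on the slides), for a r.s. from Poisson(θ) the factorization is immediate:

$$L(x_1, \ldots, x_n; \theta) = \prod_{i=1}^{n} \frac{e^{-\theta}\theta^{x_i}}{x_i!} = \underbrace{e^{-n\theta}\,\theta^{y}}_{k_1(y;\,\theta),\ y = \sum x_i} \cdot \underbrace{\left(\prod_{i=1}^{n} x_i!\right)^{-1}}_{k_2(x_1, \ldots, x_n)},$$

so $Y = \sum_{i=1}^{n} X_i$ is a ss for θ.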


EXAMPLES

1. X ~ Ber(p). For a r.s. of size n, find a ss for p, if one exists.


EXAMPLES

2. X~Beta(θ,2). For a r.s. of size n, find a ss for θ.


SUFFICIENT STATISTICS

• A ss that reduces the dimension of the data may not exist.

• Jointly ss (Y1, Y2, …, Yk) may be needed. Example: in Example 10.2.5 of Bain and Engelhardt (page 342 in the 2nd edition), X(1) and X(n) are jointly ss.

• If the MLE of θ exists and is unique, and if a ss for θ exists, then the MLE is a function of the ss for θ.


EXAMPLE

X ~ N(μ, σ²). For a r.s. of size n, find jointly ss for μ and σ².
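A sketch of the factorization for this example (the standard computation): the likelihood depends on the data only through $(\sum x_i, \sum x_i^2)$,

$$L(x_1, \ldots, x_n; \mu, \sigma^2) = (2\pi\sigma^2)^{-n/2} \exp\!\left(-\frac{1}{2\sigma^2}\sum_{i=1}^{n} x_i^2 + \frac{\mu}{\sigma^2}\sum_{i=1}^{n} x_i - \frac{n\mu^2}{2\sigma^2}\right),$$

so $\left(\sum_{i=1}^{n} X_i, \sum_{i=1}^{n} X_i^2\right)$ are jointly ss for (μ, σ²), with $k_2 \equiv 1$.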


MINIMAL SUFFICIENT STATISTICS

• If $\tilde{S}(\tilde{x}) = (s_1(\tilde{x}), \ldots, s_k(\tilde{x}))$ is a ss for θ, then

$$\tilde{S}^*(\tilde{x}) = (s_0(\tilde{x}), s_1(\tilde{x}), \ldots, s_k(\tilde{x}))$$

is also a ss for θ. But the first one does a better job of data reduction. A minimal ss achieves the greatest possible reduction.


MINIMAL SUFFICIENT STATISTICS

• A ss T(X) is called a minimal ss if, for any other ss T′(X), T(x) is a function of T′(x).

• THEOREM: Let f(x; θ) be the pmf or pdf of a sample X1, X2, …, Xn. Suppose there exists a function T(x) such that, for any two sample points x1, x2, …, xn and y1, y2, …, yn, the ratio

$$\frac{f(x_1, x_2, \ldots, x_n; \theta)}{f(y_1, y_2, \ldots, y_n; \theta)}$$

is constant with respect to θ iff T(x) = T(y). Then T(X) is a minimal sufficient statistic for θ.


EXAMPLE

• X ~ N(μ, σ²), where σ² is known. For a r.s. of size n, find a minimal ss for μ.

Note: A minimal ss is also not unique. Any one-to-one function of a minimal ss is also a minimal ss.
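A sketch of the ratio criterion for this example: with σ² known,

$$\frac{f(x_1, \ldots, x_n; \mu)}{f(y_1, \ldots, y_n; \mu)} = \exp\!\left(-\frac{1}{2\sigma^2}\left[\sum_i x_i^2 - \sum_i y_i^2\right] + \frac{n\mu}{\sigma^2}(\bar{x} - \bar{y})\right)$$

is constant in μ iff $\bar{x} = \bar{y}$, so $T(X) = \bar{X}$ is a minimal ss for μ.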


ANCILLARY STATISTIC

• A statistic S(X) whose distribution does not depend on the parameter θ is called an ancillary statistic.

• Unlike a ss, an ancillary statistic contains no information about θ.


Example

• Example 6.1.8 in Casella & Berger, page 257: Let Xi ~ Unif(θ, θ + 1), i = 1, 2, …, n. Then the range R = X(n) − X(1) is an ancillary statistic, because its pdf does not depend on θ.
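A quick empirical check (an illustrative sketch, not from the text; n, the θ values, and the seed are arbitrary): the simulated distribution of R looks the same for every θ.

```python
import numpy as np

rng = np.random.default_rng(3)
n, reps = 5, 200_000

for theta in (0.0, 10.0, -3.0):
    x = rng.uniform(theta, theta + 1.0, size=(reps, n))
    r = x.max(axis=1) - x.min(axis=1)  # range R = X(n) - X(1)
    print(f"theta = {theta:+5.1f}: mean(R) = {r.mean():.4f}, sd(R) = {r.std():.4f}")
# The summaries agree across theta, as expected for an ancillary statistic.
```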


COMPLETENESS

• Let {f(x; θ), θ ∈ Ω} be a family of pdfs (or pmfs), and let U(x) be an arbitrary function of x not depending on θ. If

$$E[U(X)] = 0 \ \text{for all } \theta \;\Rightarrow\; U(x) = 0 \ \text{for all possible values of } x,$$

then we say that this family is a complete family of pdfs (or pmfs).

• i.e., the only unbiased estimator of 0 is 0 itself.


EXAMPLES

1. Show that the family {Bin(n = 2, θ); 0 < θ < 1} is complete.
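A sketch of the standard argument: write the expectation as a polynomial in θ,

$$E[U(X)] = U(0)(1-\theta)^2 + 2U(1)\theta(1-\theta) + U(2)\theta^2 = 0 \quad \text{for all } 0 < \theta < 1,$$

which forces every coefficient of the polynomial to vanish, so U(0) = U(1) = U(2) = 0 and the family is complete.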


EXAMPLES

2. X ~ Uniform(−θ, θ). Show that the family {f(x; θ), θ > 0} is not complete.
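Assuming the symmetric interval above, a sketch: take U(x) = x. Then

$$E[U(X)] = \int_{-\theta}^{\theta} \frac{x}{2\theta}\, dx = 0 \quad \text{for all } \theta > 0,$$

yet U(x) = x is not identically zero, so the family is not complete.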


COMPLETE AND SUFFICIENT STATISTICS (css)

• Y is a complete and sufficient statistic (css) for θ if Y is a ss for θ and the family

$$\{ g(y; \theta);\ \theta \in \Omega \}$$

of pdfs of Y is complete. That is:

1) Y is a ss for θ.

2) For an arbitrary function u(Y) of Y, E[u(Y)] = 0 for all θ implies that u(y) = 0 for all possible values y of Y.


BASU THEOREM

• If T(X) is a complete and minimal sufficient statistic, then T(X) is independent of every ancillary statistic.

• Example: X ~ N(μ, σ²).

$\bar{X}$ is the mss for μ.

$(n-1)S^2/\sigma^2 \sim \chi^2_{n-1}$, so $S^2$ is an ancillary statistic for μ.

$\bar{X}$ is a complete statistic: $\bar{X} \sim N(\mu, \sigma^2/n)$, and the family $\{N(\mu, \sigma^2/n)\}$ is a complete family.

By the Basu theorem, $\bar{X}$ and $S^2$ are independent.
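An empirical check of this independence (an illustrative sketch, not from the slides; parameters and seed are arbitrary): across many samples, $\bar{X}$ carries no information about $S^2$.

```python
import numpy as np

rng = np.random.default_rng(4)
n, reps, mu, sigma = 10, 200_000, 2.0, 3.0

x = rng.normal(mu, sigma, size=(reps, n))
xbar = x.mean(axis=1)
s2 = x.var(axis=1, ddof=1)

# Independence implies zero correlation (the full claim is Basu's theorem).
print("corr(xbar, s2):", np.corrcoef(xbar, s2)[0, 1])  # close to 0

# S^2 has the same average whether xbar fell below or above mu.
print("E[S^2 | xbar < mu]:", s2[xbar < mu].mean())
print("E[S^2 | xbar > mu]:", s2[xbar > mu].mean())
```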


BASU THEOREM

• Example: X1, X2 ~ N(μ, σ²), independent, with σ² known.

• Let T = X1 + X2 and U = X1 − X2.

• We know that T is a complete minimal ss.

• U ~ N(0, 2σ²), a distribution free of μ, so U is ancillary; hence T and U are independent by Basu’s Theorem.
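Here Basu’s conclusion can also be verified directly (a check, not part of the slides): (T, U) is bivariate normal with

$$\text{Cov}(T, U) = \text{Cov}(X_1 + X_2, X_1 - X_2) = \text{Var}(X_1) - \text{Var}(X_2) = 0,$$

and for jointly normal variables zero covariance implies independence.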


Problems

• Let X1, X2, …, Xn be a random sample from a Bernoulli distribution with parameter p.

a) Find the maximum likelihood estimator (MLE) of p.

b) Is this an unbiased estimator of p?
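Assuming the standard result that the MLE here is the sample proportion $\hat{p} = \bar{X}$, a simulation sketch for part (b) (illustrative parameters and seed):

```python
import numpy as np

rng = np.random.default_rng(5)
p, n, reps = 0.3, 20, 200_000

x = rng.binomial(1, p, size=(reps, n))  # Bernoulli(p) sample in each row
p_hat = x.mean(axis=1)                  # MLE of p: the sample proportion

print("average of p_hat:", p_hat.mean())  # close to p = 0.3, consistent with unbiasedness
```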


Problems

• If the Xi are normally distributed random variables with mean μ and variance σ², what is an unbiased estimator of σ²?
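A worked reminder of the standard answer: the sample variance with divisor n − 1,

$$S^2 = \frac{1}{n-1}\sum_{i=1}^{n}(X_i - \bar{X})^2, \qquad E(S^2) = \sigma^2,$$

is an unbiased estimator of σ².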


Problems

• Suppose that X1, X2, …, Xn are i.i.d. random variables on the interval [0, 1] with density function f(x; α), where α > 0 is a parameter to be estimated from the sample. Find a sufficient statistic for α.
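The density itself did not survive transcription. Purely for illustration, if it were the common textbook choice $f(x; \alpha) = \alpha x^{\alpha-1}$ on [0, 1] (an assumption, not recoverable from this document), the factorization theorem would give

$$L(x_1, \ldots, x_n; \alpha) = \alpha^{n}\left(\prod_{i=1}^{n} x_i\right)^{\alpha-1} = k_1\!\left(\prod_i x_i;\ \alpha\right) \cdot 1,$$

so $\prod_{i=1}^{n} X_i$ (equivalently $\sum_i \log X_i$) would be sufficient for α.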