Point Estimation (lecture slides, math.utah.edulzhang/teaching...)

Posted on 10-Jun-2020


Point Estimation

Example (a variant of Problem 62, Ch5)
Manufacture of a certain component requires three different machining operations. The total time for manufacturing one such component is known to have a normal distribution. However, the mean µ and variance σ² of the normal distribution are unknown. Suppose we did an experiment in which we manufactured 10 components and recorded the operation times, with the sample times as follows:

component    1     2     3     4     5
time        63.8  60.5  65.3  65.7  61.9

component    6     7     8     9     10
time        68.2  68.1  64.8  65.8  65.4

What can we say about the population mean µ and population variance σ²?
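Natural point estimates here are the sample mean for µ and the sample variance for σ². As a quick check, the computation can be sketched in Python (the variable names are ours, not from the problem):

```python
# Operation times for the 10 manufactured components.
times = [63.8, 60.5, 65.3, 65.7, 61.9, 68.2, 68.1, 64.8, 65.8, 65.4]

n = len(times)
xbar = sum(times) / n                                # sample mean, estimates mu
s2 = sum((x - xbar) ** 2 for x in times) / (n - 1)   # sample variance, estimates sigma^2
```

This gives X̄ = 64.95 and S² ≈ 5.84 as point estimates of µ and σ².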


Point Estimation

Example (a variant of Problem 64, Ch5)
Suppose the waiting time for a certain bus in the morning is uniformly distributed on [0, θ], where θ is unknown. Suppose we record 10 waiting times as follows:

observation  1    2    3    4    5
time        7.6  1.8  4.8  3.9  7.1

observation  6    7    8    9    10
time        6.1  3.6  0.1  6.5  3.5

What can we say about the parameter θ?
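A natural point estimate of θ here is the largest recorded waiting time. A Python sketch (variable names ours; the (n+1)/n correction uses the standard fact that E(max) = nθ/(n+1) for a U[0, θ] sample, so the raw maximum tends to fall short of θ):

```python
# Recorded waiting times for the bus.
waits = [7.6, 1.8, 4.8, 3.9, 7.1, 6.1, 3.6, 0.1, 6.5, 3.5]
n = len(waits)

theta_hat = max(waits)                       # sample maximum as an estimate of theta
theta_corrected = (n + 1) / n * theta_hat    # bias-corrected version, about 8.36
```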


Point Estimation

Definition
A point estimate of a parameter θ is a single number that can be regarded as a sensible value for θ. A point estimate is obtained by selecting a suitable statistic and computing its value from the given sample data. The selected statistic is called the point estimator of θ.

e.g. X̄ = (∑_{i=1}^{10} X_i)/10 is a point estimator for µ in the normal distribution example. The largest sample value X_{10,10} is a point estimator for θ in the uniform distribution example.


Point Estimation

Problem: when there is more than one point estimator for a parameter θ, which one of them should we use?

There are a few criteria for selecting the best point estimator: unbiasedness, minimum variance, and mean square error.


Point Estimation

Definition
A point estimator θ̂ is said to be an unbiased estimator of θ if E(θ̂) = θ for every possible value of θ. If θ̂ is not unbiased, the difference E(θ̂) − θ is called the bias of θ̂.

Principle of Unbiased Estimation

When choosing among several different estimators of θ, select one that is unbiased.
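Unbiasedness can be checked empirically by averaging an estimator over many simulated samples. A minimal sketch for the uniform example, assuming a demo value θ = 5 of our own choosing: the sample maximum systematically underestimates θ (its expectation is nθ/(n+1)), while 2X̄ is unbiased for θ.

```python
import random

random.seed(0)
theta, n, trials = 5.0, 10, 20000   # theta = 5 is a demo choice, not from the slides

sum_max, sum_2xbar = 0.0, 0.0
for _ in range(trials):
    sample = [random.uniform(0, theta) for _ in range(n)]
    sum_max += max(sample)             # the sample maximum X_{10,10}
    sum_2xbar += 2 * sum(sample) / n   # 2 * X-bar, unbiased for theta

avg_max = sum_max / trials       # near n*theta/(n+1) = 4.55, i.e. biased low
avg_2xbar = sum_2xbar / trials   # near theta = 5
```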


Point Estimation

Proposition

Let X1, X2, . . . , Xn be a random sample from a distribution with mean µ and variance σ². Then the estimators

µ̂ = X̄ = (∑_{i=1}^{n} X_i)/n  and  σ̂² = S² = ∑_{i=1}^{n} (X_i − X̄)²/(n − 1)

are unbiased estimators of µ and σ², respectively.

If in addition the distribution is continuous and symmetric, then X̃ and any trimmed mean are also unbiased estimators of µ.
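The n − 1 divisor in S² is exactly what makes it unbiased; dividing by n instead gives an estimator biased low by the factor (n − 1)/n. A simulation sketch (all parameters are demo choices of ours):

```python
import random

random.seed(1)
mu, sigma, n, trials = 0.0, 2.0, 5, 40000   # true sigma^2 = 4

sum_s2_n1, sum_s2_n = 0.0, 0.0
for _ in range(trials):
    x = [random.gauss(mu, sigma) for _ in range(n)]
    xbar = sum(x) / n
    ss = sum((xi - xbar) ** 2 for xi in x)
    sum_s2_n1 += ss / (n - 1)   # divide by n-1: the unbiased S^2
    sum_s2_n += ss / n          # divide by n: biased low

avg_unbiased = sum_s2_n1 / trials   # near sigma^2 = 4
avg_biased = sum_s2_n / trials      # near 4 * (n-1)/n = 3.2
```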


Point Estimation

Principle of Minimum Variance Unbiased Estimation

Among all estimators of θ that are unbiased, choose the one that has minimum variance. The resulting θ̂ is called the minimum variance unbiased estimator (MVUE) of θ.

Theorem
Let X1, X2, . . . , Xn be a random sample from a normal distribution with mean µ and variance σ². Then the estimator µ̂ = X̄ is the MVUE for µ.
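For a normal sample, both X̄ and the sample median X̃ are unbiased for µ, so the MVUE principle picks X̄ because its variance is smaller. A simulation sketch illustrating this (parameters are our own demo choices):

```python
import random
import statistics

random.seed(2)
mu, sigma, n, trials = 10.0, 3.0, 15, 20000

means, medians = [], []
for _ in range(trials):
    x = [random.gauss(mu, sigma) for _ in range(n)]
    means.append(sum(x) / n)              # X-bar for this sample
    medians.append(statistics.median(x))  # X-tilde for this sample

var_mean = statistics.pvariance(means)      # near sigma^2 / n = 0.6
var_median = statistics.pvariance(medians)  # noticeably larger than var_mean
```

Both averages center on µ, but the median's spread is roughly 50% larger, which is why X̄ wins under the minimum-variance criterion.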


Point Estimation

Definition
Let θ̂ be a point estimator of the parameter θ. Then the quantity E[(θ̂ − θ)²] is called the mean square error (MSE) of θ̂.

Proposition

MSE = E[(θ̂ − θ)²] = V(θ̂) + [E(θ̂) − θ]²
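The decomposition MSE = variance + bias² can be verified numerically; on an empirical distribution it holds as an exact algebraic identity. A sketch using the (biased) sample maximum from the uniform example, with demo parameters of our own:

```python
import random
import statistics

random.seed(3)
theta, n, trials = 5.0, 10, 30000

# The sample maximum, a biased estimator of theta, computed for many samples.
ests = [max(random.uniform(0, theta) for _ in range(n)) for _ in range(trials)]

mse_direct = sum((e - theta) ** 2 for e in ests) / trials   # E[(est - theta)^2]
bias = sum(ests) / trials - theta                           # E(est) - theta
decomposed = statistics.pvariance(ests) + bias ** 2         # V(est) + bias^2
# mse_direct and decomposed agree, as the proposition states
```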


Point Estimation

Definition
The standard error of an estimator θ̂ is its standard deviation σ_θ̂ = √V(θ̂). If the standard error itself involves unknown parameters whose values can be estimated, substitution of these estimates into σ_θ̂ yields the estimated standard error (estimated standard deviation) of the estimator. The estimated standard error can be denoted either by σ̂_θ̂ or by s_θ̂.
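For X̄ from an i.i.d. sample, V(X̄) = σ²/n, so σ_X̄ = σ/√n; substituting the sample standard deviation s for the unknown σ gives the estimated standard error s/√n. Applied to the machining-time data from the first example (Python sketch, variable names ours):

```python
import math

times = [63.8, 60.5, 65.3, 65.7, 61.9, 68.2, 68.1, 64.8, 65.8, 65.4]
n = len(times)
xbar = sum(times) / n
s = math.sqrt(sum((x - xbar) ** 2 for x in times) / (n - 1))  # sample std dev

se_hat = s / math.sqrt(n)   # estimated standard error of X-bar, about 0.76
```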
