
## Mean Square Error

Definition 8.3
For a random sample $X_1, \dots, X_n$ from $f(x; \theta)$ and a statistic $T(X_1, \dots, X_n)$ which is an estimator of $\theta$, the mean square error (mse) is defined as

$$\operatorname{mse}(T) = E\left[(T - \theta)^2\right] \qquad (8.2)$$

The mse can be expressed alternatively as

$$E\left[(T - \theta)^2\right] = E\left[\left\{(T - E(T)) + (E(T) - \theta)\right\}^2\right] = \operatorname{Var}(T) + \left[E(T) - \theta\right]^2,$$

the cross term vanishing because $E[T - E(T)] = 0$. So we have

$$\operatorname{mse}(T) = \operatorname{Var}(T) + \left[\operatorname{bias}(T)\right]^2 \qquad (8.3)$$

Now from (8.3) we can see that the mse cannot usually be made equal to zero. It will only be small when both $\operatorname{Var}(T)$ and the bias in $T$ are small. So rather than use unbiasedness and minimum variance to characterize the "goodness" of a point estimator, we might employ the mean square error.
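The decomposition in (8.3) can be checked numerically. Below is a minimal Monte Carlo sketch; the $N(2, 1)$ population, the sample size, and the deliberately biased estimator $T = 0.9\bar{X}$ are illustrative assumptions, not from the text:

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 20, 200_000
theta = 2.0  # true mean of an assumed N(2, 1) population

# A deliberately biased estimator of theta: T = 0.9 * Xbar (illustrative choice)
T = 0.9 * rng.normal(theta, 1.0, size=(reps, n)).mean(axis=1)

mse_direct = np.mean((T - theta) ** 2)              # E[(T - theta)^2]
decomposed = np.var(T) + (np.mean(T) - theta) ** 2  # Var(T) + bias^2

print(mse_direct, decomposed)
```

Over the empirical distribution of the simulated $T$ values the identity holds exactly, so the two printed numbers agree to machine precision.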

Example 8.1
Consider the problem of the choice of estimator of $\sigma^2$ based on a random sample of size $n$ from a $N(\mu, \sigma^2)$ distribution. Recall that

$$S^2 = \frac{1}{n-1}\sum_{i=1}^{n}(X_i - \bar{X})^2$$

is often called the sample variance and has the properties

$$E(S^2) = \sigma^2, \qquad \operatorname{Var}(S^2) = \frac{2\sigma^4}{n-1}.$$

[Note that this is not HC's use of $S^2$. See 4.1 Definition 3.]

Consider the mle of $\sigma^2$, $\frac{1}{n}\sum_{i=1}^{n}(X_i - \bar{X})^2$, which we'll denote by $\hat{\sigma}^2$. Now

$$\hat{\sigma}^2 = \frac{n-1}{n} S^2 \quad \text{and} \quad E(\hat{\sigma}^2) = \frac{n-1}{n}\sigma^2.$$

Why is $\hat{\sigma}^2$ biased? To calculate $\sum_{i=1}^{n}(X_i - \bar{X})^2$ we first have to estimate the mean, consuming 1 degree of freedom. So we do not have $n$ independent estimates of dispersion about the mean; we have $n-1$.
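The lost degree of freedom shows up directly in simulation: the expected value of $\sum(X_i - \bar{X})^2$ is $(n-1)\sigma^2$, not $n\sigma^2$. A minimal sketch, where the choices $n = 5$, $\sigma^2 = 4$ and the replication count are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n, sigma2, reps = 5, 4.0, 400_000  # illustrative choices
x = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))

# Sum of squared deviations about the *sample* mean, for each replicate
ss = ((x - x.mean(axis=1, keepdims=True)) ** 2).sum(axis=1)

print(ss.mean() / sigma2)  # close to n - 1 = 4, not n = 5
```

Deviations taken about the known population mean instead of $\bar{X}$ would average $n\sigma^2$; centring on $\bar{X}$ is what costs the degree of freedom.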

Now $\hat{\sigma}^2$ is biased, but what about its mean square error? Using (8.3),

$$\operatorname{mse}(\hat{\sigma}^2) = \operatorname{Var}(\hat{\sigma}^2) + \left[\operatorname{bias}(\hat{\sigma}^2)\right]^2 = \left(\frac{n-1}{n}\right)^2 \frac{2\sigma^4}{n-1} + \left(\frac{\sigma^2}{n}\right)^2 = \frac{(2n-1)\sigma^4}{n^2}.$$
Now for $n > 1$,

$$\operatorname{mse}(\hat{\sigma}^2) = \frac{(2n-1)\sigma^4}{n^2} < \frac{2\sigma^4}{n-1} = \operatorname{mse}(S^2),$$

(note that $\operatorname{mse}(S^2) = \operatorname{Var}(S^2)$ because $S^2$ is unbiased), since $(2n-1)(n-1) = 2n^2 - 3n + 1 < 2n^2$ for $n$ an integer greater than 1. So for the normal distribution the mle of $\sigma^2$ is better in the sense of mse than the sample variance.
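The comparison can also be checked by simulation. A minimal sketch, where $n = 10$, $\sigma^2 = 1$ and the replication count are illustrative assumptions; with these values the theory above gives $\operatorname{mse}(S^2) = 2/9 \approx 0.222$ and $\operatorname{mse}(\hat{\sigma}^2) = 19/100 = 0.19$:

```python
import numpy as np

rng = np.random.default_rng(2)
n, sigma2, reps = 10, 1.0, 500_000  # illustrative choices
x = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))

s2 = x.var(axis=1, ddof=1)   # sample variance S^2 (divisor n - 1)
mle = x.var(axis=1, ddof=0)  # mle of sigma^2 (divisor n)

mse_s2 = np.mean((s2 - sigma2) ** 2)    # theory: 2*sigma2^2/(n - 1)
mse_mle = np.mean((mle - sigma2) ** 2)  # theory: (2n - 1)*sigma2^2/n^2

print(mse_s2, mse_mle)  # the mle has the smaller mse
```

NumPy's `ddof` argument selects the divisor $n - \mathrm{ddof}$, so the two estimators differ only in that divisor, exactly as in the text.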

Bob Murison 2000-10-31