
## When can the MVB be Attained?

It is easy to establish the condition under which the minimum variance bound (8.9) for an unbiased estimator is achieved. In the proof of Theorem 8.2, note that the inequality concerning the correlation of $V$ and $T$ becomes an equality (that is, $\rho(V,T) = +1$ or $-1$) when $V$ is a linear function of $T$. Recalling that $E(V) = 0$, we may write this condition as

$$V = A\,[T - E(T)],$$

where $A$ is independent of the observations but may be a function of $\theta$, so we will write it as $A(\theta)$. So the condition for the MVB to be attained is that the statistic $T = t(\mathbf{x})$ satisfies

$$\frac{\partial \log L}{\partial \theta} = A(\theta)\,[\,t(\mathbf{x}) - \tau(\theta)\,], \qquad (8.11)$$

where $\tau(\theta) = E(T)$ is the quantity being estimated.

Example 8.5
In the problem of estimating the mean $\mu$ of a normal distribution with known variance $\sigma^2$, show that the MVB of an unbiased estimator can be attained.

As in Example 8.2,

$$\frac{\partial \log L}{\partial \mu} = \frac{n}{\sigma^2}\,(\bar{x} - \mu).$$

Now defining $T = \bar{X}$, we know it is an unbiased estimator of $\mu$, and we see that (8.11) is satisfied with $A(\mu) = n/\sigma^2$ ($A$ not being a function of $\mu$ in this case). Thus the minimum variance bound can be attained.
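As a quick numerical illustration of the example above (a sketch, not part of the original notes; the parameter values are arbitrary), we can simulate repeated normal samples and check that the variance of $\bar{X}$ is close to the MVB $\sigma^2/n = 1/A(\mu)$:

```python
import random
import statistics

# Monte-Carlo check: for normal samples with known variance sigma^2,
# the sample mean should attain the MVB, i.e. Var(xbar) ~ sigma^2 / n.
random.seed(1)
mu, sigma, n, reps = 5.0, 2.0, 50, 20000   # arbitrary illustrative values

xbars = []
for _ in range(reps):
    sample = [random.gauss(mu, sigma) for _ in range(n)]
    xbars.append(statistics.fmean(sample))

var_xbar = statistics.pvariance(xbars)
mvb = sigma**2 / n          # = 1 / A(mu), with A(mu) = n / sigma^2
print(var_xbar, mvb)        # the two values agree to within simulation error
```

The agreement is only up to Monte-Carlo error, so a tolerance of a few percent is expected with this many replications.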

Comment. In the case of an unbiased estimator $T$ where the MVB is attained, the inequality in (8.9) becomes an equality and we have

$$\mathrm{Var}(T) = \frac{1}{E\!\left[\left(\dfrac{\partial \log L}{\partial \theta}\right)^{2}\right]}. \qquad (8.12)$$

Also, squaring (8.11) and taking expectations of both sides, we have

$$E\!\left[\left(\frac{\partial \log L}{\partial \theta}\right)^{2}\right] = A(\theta)^{2}\, E\!\left[(T - \tau(\theta))^{2}\right].$$

That is,

$$E\!\left[\left(\frac{\partial \log L}{\partial \theta}\right)^{2}\right] = A(\theta)^{2}\,\mathrm{Var}(T),$$

giving, on substituting into (8.12),

$$\mathrm{Var}(T) = \frac{1}{A(\theta)^{2}\,\mathrm{Var}(T)}, \qquad \text{so that} \qquad \mathrm{Var}(T) = \frac{1}{A(\theta)}.$$

So, if the statistic satisfies (8.11), $\mathrm{Var}(T)$ can be identified immediately as the reciprocal of the factor $A(\theta)$ multiplying $[\,t(\mathbf{x}) - \tau(\theta)\,]$ on the RHS. For instance, in Example 8.5, the factor $n/\sigma^2$ multiplying $(\bar{x} - \mu)$ can be identified as the reciprocal of the variance of $\bar{X}$, and it was not necessary to evaluate the MVB as in Example 8.2.

Example 8.6
Consider the problem of estimating the variance $\sigma^2$ of a normal distribution with known mean $\mu$, based on a sample of size $n$.

Now the likelihood is

$$L = (2\pi\sigma^{2})^{-n/2} \exp\!\left\{-\frac{1}{2\sigma^{2}}\sum_{i=1}^{n}(x_i - \mu)^{2}\right\},$$

so that

$$\frac{\partial \log L}{\partial \sigma^{2}} = -\frac{n}{2\sigma^{2}} + \frac{1}{2\sigma^{4}}\sum_{i=1}^{n}(x_i - \mu)^{2} = \frac{n}{2\sigma^{4}}\left[\frac{1}{n}\sum_{i=1}^{n}(x_i - \mu)^{2} - \sigma^{2}\right],$$

which is in the form (8.11) with $T = \frac{1}{n}\sum_{i=1}^{n}(X_i - \mu)^{2}$ and $A(\sigma^{2}) = n/(2\sigma^{4})$. So, using $T$ as the estimator of $\sigma^{2}$, the MVB is achieved and it is $2\sigma^{4}/n$.

Note that $E(T) = \sigma^{2}$, so $T$ is an unbiased estimator of $\sigma^{2}$. Also, $\sum_{i=1}^{n}(X_i - \mu)^{2}/\sigma^{2} \sim \chi^{2}_{n}$, so it has variance $2n$. Hence,

$$\mathrm{Var}(T) = \frac{\sigma^{4}}{n^{2}} \times 2n = \frac{2\sigma^{4}}{n},$$

confirming that the MVB is attained.
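The same kind of simulation check applies to this example (again a sketch with arbitrary parameter values, not from the original notes): with $\mu$ known, the average of squared deviations about $\mu$ should be unbiased for $\sigma^2$ with variance close to $2\sigma^4/n$:

```python
import random
import statistics

# Monte-Carlo check: with known mean mu, T = (1/n) * sum((x_i - mu)^2)
# should be unbiased for sigma^2, with Var(T) ~ 2 * sigma^4 / n.
random.seed(2)
mu, sigma, n, reps = 0.0, 1.5, 40, 20000   # arbitrary illustrative values

t_values = []
for _ in range(reps):
    sample = [random.gauss(mu, sigma) for _ in range(n)]
    t_values.append(sum((x - mu) ** 2 for x in sample) / n)

mean_t = statistics.fmean(t_values)   # should be close to sigma^2 (unbiasedness)
var_t = statistics.pvariance(t_values)
mvb = 2 * sigma**4 / n                # = 1 / A(sigma^2), A(sigma^2) = n / (2 sigma^4)
print(mean_t, var_t, mvb)
```

Both the unbiasedness of $T$ and the attainment of the bound are reproduced to within simulation error.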

Example 8.7

Consider the problem where we have a random sample from a Poisson distribution with parameter $\lambda$, and we wish to find the Cramér-Rao lower bound for the variance of an unbiased estimator of $\lambda$, and identify the estimator that has this variance.

Now for $x_i = 0, 1, 2, \ldots$, the likelihood of the sample is

$$L = \prod_{i=1}^{n} \frac{e^{-\lambda}\lambda^{x_i}}{x_i!}, \qquad \text{so that} \qquad \frac{\partial \log L}{\partial \lambda} = \frac{1}{\lambda}\sum_{i=1}^{n} x_i - n = \frac{n}{\lambda}\,(\bar{x} - \lambda),$$

where $\bar{X}$ is the statistic. This is in the correct form (8.11) for the minimum variance bound to be attained, and it is $\lambda/n$. We note that $\bar{X}$ is an unbiased estimator of $\lambda$ which has variance $\lambda/n$, so the bound is attained.
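A simulation along the same lines (a sketch, not from the original notes; the `poisson` helper below is a hand-rolled sampler using Knuth's algorithm, and the parameter values are arbitrary) confirms that $\mathrm{Var}(\bar{X})$ is close to the CRLB $\lambda/n$:

```python
import math
import random
import statistics

# Monte-Carlo check: for Poisson samples, Var(xbar) ~ lambda / n, the CRLB.
random.seed(3)

def poisson(lam):
    # Knuth's Poisson sampler; adequate for small lambda (hypothetical helper,
    # used here only because the stdlib has no Poisson generator).
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

lam, n, reps = 3.0, 30, 20000              # arbitrary illustrative values
xbars = [statistics.fmean(poisson(lam) for _ in range(n)) for _ in range(reps)]

var_xbar = statistics.pvariance(xbars)
crlb = lam / n
print(var_xbar, crlb)   # should agree to within simulation error
```

In practice one would use a library Poisson generator (e.g. NumPy's); the stdlib-only sampler keeps the sketch self-contained.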

Bob Murison 2000-10-31