
Marginal Density Functions

Before developing the theory, we sketch the situation we are modelling.

When we derive the distribution of a single order statistic, we divide the underlying distribution into 3 categories.

Figure 5.2: An order statistic in an underlying distribution
\includegraphics[width=10cm,height=6cm]{NOTES/DISTNTH/ORDERSTATS/ostats.2}

The observed value of the $r$th order statistic is $y_r$, i.e. $Y_r = y_r$. $Y_r$ is a random variable ($Y_r$ will take a different value in a new sample) with density $f_{Y_r}(y_r)$. We have

  1. $(r-1)$ observations $ < y_r$ with probability $F(y_r)$,
  2. 1 observation at $y_r$, with density $f(y_r)$,
  3. $(n-r)$ observations $ > y_r$ with probability $1-F(y_r)$.

Ordering and classifying by $Y_r$ has produced a form similar to a multinomial distribution with 3 categories, $< y_r$, $= y_r$, $> y_r$; associated with these categories are the quantities $F(y_r)$, $f(y_r)$ and $1-F(y_r)$.

The multinomial density function for 3 categories is

\begin{displaymath}P(X_1=n_1,X_2=n_2,X_3=n_3) = {{n!} \over{n_1!\,n_2!\,n_3!}}\times p_1^{n_1}p_2^{n_2}p_3^{n_3} \ \ .\end{displaymath}
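For example, with $n=5$ trials and category probabilities $p_1=0.2$, $p_2=0.3$, $p_3=0.5$, the probability of observing the counts $(1,2,2)$ is

\begin{displaymath}P(X_1=1,X_2=2,X_3=2) = {{5!} \over{1!\,2!\,2!}}\,(0.2)^1(0.3)^2(0.5)^2 = 30 \times 0.0045 = 0.135 \ .\end{displaymath}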

The density of an order statistic has a similar form,

\begin{displaymath}f_{Y_r}(y_r) = {{n!} \over{(r-1)!\,1!\,(n-r)!}}\times
\left[F(y_r)\right]^{(r-1)} \times f(y_r) \times \left[1 - F(y_r)\right]^{(n-r)} \ . \end{displaymath}

Note there are 3 components of the density corresponding to the 3 categories.
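For example, if the underlying distribution is uniform on $(0,1)$, so that $F(y)=y$ and $f(y)=1$, the density of $Y_r$ reduces to

\begin{displaymath}f_{Y_r}(y_r) = {{n!} \over{(r-1)!\,(n-r)!}}\, y_r^{r-1}(1-y_r)^{n-r} \ , \ 0<y_r<1 \ ,\end{displaymath}

which is the Beta$(r,n-r+1)$ density.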

For 2 order statistics, $Y_r$ and $Y_s$ with $r<s$, there are 5 categories,

  1. $y < y_r$
  2. $y = y_r$
  3. $y_r < y < y_s$
  4. $y = y_s$
  5. $y > y_s$

Figure 5.3: Two order statistics in an underlying distribution
\includegraphics[width=10cm,height=6cm]{NOTES/DISTNTH/ORDERSTATS/ostats.3}

By the same multinomial analogy used for a single order statistic, the density has 5 components,

\begin{eqnarray*}
\lefteqn{f_{Y_r,Y_s}(y_r,y_s) = {{n!} \over{(r-1)!\,1!\,(s-r-1)!\,1!\,(n-s)!}} \times} \\
& & \left[F(y_r)\right]^{r-1} \times f(y_r) \times
\left[F(y_s)-F(y_r)\right]^{s-r-1} \times f(y_s) \times
\left[1 - F(y_s)\right]^{(n-s)} \ .
\end{eqnarray*}
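In particular, taking $r=1$ and $s=n$ gives the joint density of the sample minimum and maximum,

\begin{displaymath}f_{Y_1,Y_n}(y_1,y_n) = n(n-1)\, f(y_1)\, f(y_n) \left[F(y_n)-F(y_1)\right]^{n-2} \ , \ y_1 < y_n \ .\end{displaymath}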



Formal derivation of the densities of order statistics

Since we know the pdf of ${\bf Y} = (Y_1, Y_2, \dots , Y_n)$ is given by (5.1), the marginal pdf of the $r$th smallest component, $Y_r$, can be found by integrating over the remaining $(n-1)$ variables. Thus

\begin{displaymath}
f_{Y_r}(y_r)= \int^{y_r}_{-\infty} \int^{y_{r-1}}_{-\infty} \dots \int^{y_2}_{-\infty}
\left[ \int^{\infty}_{y_r} \dots \int^{\infty}_{y_{n-1}} n! \prod^n_{i=1} f(y_i)\, dy_n \dots dy_{r+1} \right] dy_1 \dots dy_{r-1} \ .
\end{displaymath} (5.2)

(The brackets are inserted as a guide to the order of integration; they are not strictly required.)

Notice that the order of integration is to integrate first over $y_n$, then $y_{n-1}, \dots$, and then $y_{r+1}$ (this is the part of (5.2) enclosed by the brackets). This is followed by integration over $y_1$, then $y_2, \dots$, and finally over $y_{r-1}$. The limits of integration are obtained from the inequalities

\begin{displaymath}\infty > y_n > y_{n-1} > \dots > y_{r+1} > y_r \end{displaymath}

and

\begin{displaymath}-\infty < y_1 < y_2 < \dots <y_{r-1} < y_r \ . \end{displaymath}

In order to integrate (5.1), we first have
\begin{eqnarray*}
\lefteqn{\int^\infty_{y_r} \dots \int^\infty_{y_{n-2}}
\int^\infty_{y_{n-1}} \prod^n_{i=r+1} f(y_i)\, dy_n \dots dy_{r+1}} \\
& = & \int^\infty_{y_r} \dots \int^\infty_{y_{n-2}} [1-F(y_{n-1})]
f(y_{n-1}) \prod^{n-2}_{i=r+1} f(y_i)\, dy_{n-1} \dots dy_{r+1} \\
& = & \int^\infty_{y_r} \dots \int^\infty_{y_{n-3}} \frac{[1-F(y_{n-2})]^2}{2!}
f(y_{n-2}) \prod^{n-3}_{i=r+1} f(y_i)\, dy_{n-2} \dots dy_{r+1} \\
& = & \frac{[1-F(y_r)]^{n-r}}{(n-r)!} \ , \ \mbox{on simplification.}
\end{eqnarray*} (5.3)
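Each step integrates out the innermost variable. For instance, the substitution $u = 1-F(y_{n-1})$, $du = -f(y_{n-1})\,dy_{n-1}$ in the second line gives

\begin{displaymath}\int^\infty_{y_{n-2}} [1-F(y_{n-1})]\, f(y_{n-1})\, dy_{n-1} = \int_0^{1-F(y_{n-2})} u\, du = \frac{[1-F(y_{n-2})]^2}{2!} \ ,\end{displaymath}

and repeating the substitution produces the factorials in (5.3).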

Similarly

\begin{eqnarray*}
\lefteqn{\int^{y_r}_{-\infty} \int^{y_{r-1}}_{-\infty} \dots \int^{y_2}_{-\infty}
\prod^{r-1}_{i=1} f(y_i)\, dy_1 \dots dy_{r-1}} \\
& = & \int^{y_r}_{-\infty} \dots \int^{y_3}_{-\infty} F(y_2) f(y_2)
\prod^{r-1}_{i=3} f(y_i)\, dy_2 \dots dy_{r-1} \\
& = & \int^{y_r}_{-\infty} \dots \int^{y_4}_{-\infty} \frac{[F(y_3)]^2}{2!}
f(y_3) \prod^{r-1}_{i=4} f(y_i)\, dy_3 \dots dy_{r-1} \\
& = & \frac{[F(y_r)]^{r-1}}{(r-1)!} \ , \ \mbox{on simplification.}
\end{eqnarray*} (5.4)

Hence, using (5.3) and (5.4) in (5.2), we obtain

\begin{eqnarray*}
f_{Y_r}(y_r) & = & n!\, f(y_r) \int^{y_r}_{-\infty} \dots \int^{y_2}_{-\infty}
\frac{[1-F(y_r)]^{n-r}}{(n-r)!} \prod^{r-1}_{i=1} f(y_i)\, dy_1 \dots dy_{r-1} \\
& = & n!\, f(y_r)\, \frac{[1-F(y_r)]^{n-r}}{(n-r)!} \cdot \frac{[F(y_r)]^{r-1}}{(r-1)!}
\end{eqnarray*}



so that the marginal p.d.f. of $Y_r$ is given by

\fbox{$\displaystyle f_{Y_r}(y_r) = \frac{n!}{(r-1)!\,(n-r)!}\,[F(y_r)]^{r-1}\,[1-F(y_r)]^{n-r}\,f(y_r) \ , \ a<y_r<b$} (5.5)
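As a numerical sanity check of (5.5), one can simulate order statistics. The sketch below is not part of the derivation; it assumes NumPy and SciPy are available and uses the uniform case noted earlier, where $Y_r$ has a Beta$(r,n-r+1)$ distribution, with illustrative values $n=10$, $r=3$.

\begin{verbatim}
import numpy as np
from scipy.stats import beta

# Monte Carlo check of (5.5) in the uniform(0,1) case, where the
# r-th order statistic of n observations has a Beta(r, n-r+1) density.
rng = np.random.default_rng(0)               # arbitrary seed
n, r = 10, 3                                 # illustrative values
y_r = np.sort(rng.uniform(size=(100_000, n)), axis=1)[:, r - 1]

# Empirical moments versus the Beta(r, n-r+1) values.
print(y_r.mean(), beta.mean(r, n - r + 1))   # both close to r/(n+1)
print(y_r.var(), beta.var(r, n - r + 1))
\end{verbatim}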



The probability density functions of both the minimum observation $(r=1)$ and the maximum observation $(r=n)$ are special cases of (5.5).



\fbox{$\displaystyle f_{Y_1}(y_1) = n\,[1-F(y_1)]^{n-1}\,f(y_1) \ , \ a<y_1<b$} (5.6)



\fbox{$\displaystyle f_{Y_n}(y_n) = n\,[F(y_n)]^{n-1}\,f(y_n) \ , \ a<y_n<b$} (5.7)



The integration technique can be applied to find the joint pdf of two (or more) order statistics, and this is done in Section 5.4. Before examining that, we give an alternative (much briefer) derivation of (5.7).

Let the cdf of $Y_n$ be denoted by $F_{Y_n}$. For any value $y$ in the range space of $Y_n$, the cdf of $Y_n$ is

\begin{eqnarray*}
F_{Y_n}(y)&=&P(Y_n \leq y)=\mbox{P(all $n$\ observations }\leq
y)\\
&=&[\mbox{P(an observation }\leq y)]^n\\
&=&[F(y)]^n
\end{eqnarray*}



The pdf of $Y_n$ is thus $f_{Y_n}(y)=F'_{Y_n}(y)=n[F(y)]^{n-1}\,f(y) \ , \ a<y<b$.

Of course $y$ in the above is just a dummy, and could be replaced by $y_n$ to give (5.7).


Exercise Use this technique to prove (5.6).


Example 5.1
Let $X_1, \ldots, X_n$ be a sample from the uniform distribution $f(x)=1/\theta \ ,\ 0<x< \theta$. Find a $100(1-\alpha)\%$ CI for $\theta$ using the largest order statistic, $Y_n$.

By definition,

\begin{displaymath}0<Y_1<Y_2 < \ldots < Y_n < \theta \ .\end{displaymath}

So $Y_n$ will suffice as the lower limit for $\theta$. Given the information gleaned from the order statistics, what is the upper limit? Using the above result for the density of the largest order statistic,

\begin{eqnarray*}
f_{Y_n}(y_n) & = & n \left[F(y_n)\right]^{n-1} f(y_n) \\
& = & n\, {{y_n^{n-1}} \over {\theta^n}} \ , \quad 0<y_n<\theta \ .
\end{eqnarray*}



Choose $c$ such that,

\begin{eqnarray*}
P(c\theta < Y_n < \theta) & = & 1-\alpha \\
\int_{c\theta}^{\theta} n\, {{y_n^{n-1}} \over {\theta^n}}\, dy_n & = & 1-\alpha \\
\left[ {{y_n^{n}} \over {\theta^n}} \right]_{c\theta}^{\theta} & = & 1-\alpha \\
1 - c^n & = & 1-\alpha \ \Rightarrow \ c = \alpha^{\frac{1}{n}}
\end{eqnarray*}



Therefore,

\begin{eqnarray*}
P\left(\theta \alpha^{\frac{1}{n}} < Y_n < \theta \right) & = & 1-\alpha \\
P\left(Y_n < \theta < \frac{Y_n}{\alpha^{\frac{1}{n}}} \right) & = & 1-\alpha
\end{eqnarray*}



A $100 (1-\alpha)\% $ CI for $\theta$ is given by $(Y_n,Y_n \alpha ^{-\frac{1}{n}} )$.
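As a quick check, the coverage of $(Y_n, Y_n \alpha^{-\frac{1}{n}})$ can be confirmed by simulation; the sketch below assumes NumPy and uses arbitrary illustrative values of $\theta$, $n$ and $\alpha$.

\begin{verbatim}
import numpy as np

# Coverage check for the CI (Y_n, Y_n * alpha**(-1/n)) of Example 5.1.
rng = np.random.default_rng(1)      # arbitrary seed
theta, n, alpha = 5.0, 8, 0.05      # illustrative values
reps = 100_000
y_n = rng.uniform(0, theta, size=(reps, n)).max(axis=1)
covered = (y_n < theta) & (theta < y_n * alpha ** (-1 / n))
print(covered.mean())               # close to 1 - alpha = 0.95
\end{verbatim}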

