

Assumed knowledge of matrices and vector spaces

  1. Use of terms singular, diagonal, unit, null, symmetric.
  2. Operations of addition, subtraction, multiplication, inverse and transpose.

    [We will use $A'$ for the transpose of $A$.]

    1. $(AB)'=B'A'$,
    2. $(AB)^{-1}=B^{-1}A^{-1}$, provided $A$ and $B$ are non-singular,
    3. $(A^{-1})'=(A')^{-1}$.
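
    As an illustrative check, these three identities can be verified numerically. The sketch below is a minimal example assuming NumPy; the matrices and seed are arbitrary (random Gaussian matrices are non-singular with probability one).

      # Minimal numerical check of the transpose/inverse identities above.
      import numpy as np

      rng = np.random.default_rng(0)          # arbitrary seed
      A = rng.standard_normal((3, 3))
      B = rng.standard_normal((3, 3))

      # (AB)' = B'A'
      assert np.allclose((A @ B).T, B.T @ A.T)
      # (AB)^{-1} = B^{-1} A^{-1}  (requires A and B non-singular)
      assert np.allclose(np.linalg.inv(A @ B),
                         np.linalg.inv(B) @ np.linalg.inv(A))
      # (A^{-1})' = (A')^{-1}
      assert np.allclose(np.linalg.inv(A).T, np.linalg.inv(A.T))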
  3. The trace of a square matrix $A$, written tr$(A)$, is defined as the sum of the diagonal elements of $A$. That is,

    \begin{displaymath}\mbox{tr}(A) = \sum_i a_{ii}. \end{displaymath}

    1. tr$(A \pm B)=$tr$(A) \pm $tr$(B)$,
    2. tr$(AB)=$tr$(BA)$.
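
    A quick numerical check of both trace identities, again a NumPy sketch with arbitrary random matrices:

      # tr(A±B) = tr(A) ± tr(B) and tr(AB) = tr(BA).
      import numpy as np

      rng = np.random.default_rng(1)          # arbitrary seed
      A = rng.standard_normal((4, 4))
      B = rng.standard_normal((4, 4))

      assert np.isclose(np.trace(A + B), np.trace(A) + np.trace(B))
      assert np.isclose(np.trace(A - B), np.trace(A) - np.trace(B))
      assert np.isclose(np.trace(A @ B), np.trace(B @ A))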
  4. Linear Independence and Rank
    1. Let ${\bf x}_1$, ..., ${\bf x}_n$ be a set of vectors and $c_1$, ..., $c_n$ be scalar constants. If $\sum_i c_i {\bf x}_i=0$ only if $c_1=c_2=\ldots =c_n=0$, then the set of vectors is linearly independent.
    2. The rank of a set of vectors is the maximum number of linearly independent vectors in the set.
    3. For any matrix $A$, the rank of $A$, denoted by r$(A)$, is the maximum order of its non-zero subdeterminants.
    4. r$(AA')=$r$(A'A)=$r$(A)=$r$(A')$; see the numerical check below.
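
    The rank identity can be illustrated numerically. The sketch below assumes NumPy and uses a deliberately rank-deficient matrix chosen for this example:

      # Check r(AA') = r(A'A) = r(A) = r(A') on a rank-2 matrix.
      import numpy as np

      A = np.array([[1., 2., 3.],
                    [2., 4., 6.],   # multiple of row 1, so r(A) = 2
                    [0., 1., 1.],
                    [0., 2., 2.]])  # multiple of row 3

      r = np.linalg.matrix_rank
      assert r(A) == r(A.T) == r(A @ A.T) == r(A.T @ A) == 2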
  5. Quadratic Forms

    For a $p$-vector ${\bf x}$, where ${\bf x}'=(x_1, \ldots, x_p)$, and a square $p \times p$ matrix $A$,

    \begin{displaymath}{\bf x}'A{\bf x}=\sum_{i,j=1}^{p}a_{ij}x_ix_j \end{displaymath}

    is a quadratic form in $x_1, \ldots, x_p$.

    The matrix $A$ and the quadratic form are called:

    1. positive semidefinite if ${\bf x}'A{\bf x} \geq 0$ for all $\bf x$ and ${\bf x}'A{\bf x}=0$ for some ${\bf x} \neq 0$.
    2. positive definite if ${\bf x}'A{\bf x}>0$ for all ${\bf x} \neq 0$.
      1. A necessary and sufficient condition for a symmetric matrix $A$ to be positive definite is that each leading principal sub-determinant is greater than $0$. So a positive definite matrix is non-singular.
      2. A necessary and sufficient condition for a symmetric matrix $A$ to be positive definite is that there exists a non-singular matrix $P$ such that $A=PP'$.
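
    Both characterizations can be illustrated numerically. The sketch below assumes NumPy, builds $A=PP'$ from an arbitrary random non-singular $P$, and checks the quadratic form and the leading principal sub-determinants:

      # Build a positive definite A = PP' and check both conditions.
      import numpy as np

      rng = np.random.default_rng(2)          # arbitrary seed
      P = rng.standard_normal((3, 3))         # non-singular with probability 1
      A = P @ P.T                             # symmetric positive definite

      for _ in range(100):
          x = rng.standard_normal(3)
          assert x @ A @ x > 0                # quadratic form is positive

      # leading principal sub-determinants |A_k|, k = 1, 2, 3, are all > 0
      minors = [np.linalg.det(A[:k, :k]) for k in range(1, 4)]
      assert all(m > 0 for m in minors)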
  6. Orthogonality.

    A square matrix $P$ is said to be orthogonal if $PP'=I$ (equivalently, $P'P=I$).

    1. An orthogonal matrix is non-singular.
    2. The determinant of an orthogonal matrix is $\pm1$.
    3. The transpose of an orthogonal matrix is also orthogonal.
    4. The product of two orthogonal matrices is orthogonal.
    5. If $P$ is orthogonal, tr$(P'AP)=$tr$(APP')=$tr$(A)$.
    6. If $P$ is orthogonal, r$(P'AP)=$r$(A)$.
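
    These properties can be checked numerically; one convenient way to generate an orthogonal matrix is via a QR decomposition. A minimal NumPy sketch with arbitrary random input:

      # Obtain an orthogonal P from QR and verify the listed properties.
      import numpy as np

      rng = np.random.default_rng(3)          # arbitrary seed
      P, _ = np.linalg.qr(rng.standard_normal((4, 4)))  # P is orthogonal
      A = rng.standard_normal((4, 4))

      assert np.allclose(P @ P.T, np.eye(4))            # PP' = I
      assert np.isclose(abs(np.linalg.det(P)), 1.0)     # determinant is +-1
      assert np.isclose(np.trace(P.T @ A @ P), np.trace(A))
      assert np.linalg.matrix_rank(P.T @ A @ P) == np.linalg.matrix_rank(A)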
  7. Eigenvalues and eigenvectors.

    Eigenvalues of a square matrix $A$ are defined as the roots of the equation
    $\vert A-\lambda I\vert=0$. The corresponding non-zero vectors ${\bf x}$ satisfying $(A-\lambda I){\bf x}={\bf 0}$ are the eigenvectors.

    1. The eigenvectors of a symmetric matrix corresponding to two different eigenvalues are orthogonal.
    2. The number of non-zero eigenvalues (counted with multiplicity) of a symmetric matrix $A$ is equal to the rank of $A$.
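
    A numerical illustration for a small hand-picked symmetric matrix, assuming NumPy:

      # For symmetric A: orthonormal eigenvectors; #nonzero eigenvalues = rank.
      import numpy as np

      A = np.array([[2., 1., 0.],
                    [1., 2., 0.],
                    [0., 0., 0.]])            # symmetric, rank 2

      vals, vecs = np.linalg.eigh(A)          # eigh is for symmetric matrices
      # columns of vecs are orthonormal eigenvectors
      assert np.allclose(vecs.T @ vecs, np.eye(3))
      # eigenvalues are 0, 1, 3: two non-zero, matching r(A) = 2
      assert np.count_nonzero(~np.isclose(vals, 0.0)) == np.linalg.matrix_rank(A)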
  8. Reduction to diagonal form
    1. Given any symmetric $p \times p$ matrix $A$ there exists an orthogonal matrix $P$ such that $P'AP=\Lambda$ where $\Lambda$ is a diagonal matrix whose elements are the eigenvalues of $A$. We write $P'AP=\mbox{diag}(\lambda_1, \ldots, \lambda_p)$.
      1. If $A$ is not of full rank, some of the $\lambda_i$ will be zero.
      2. If $A$ is positive definite (and therefore non-singular), all the $\lambda_i$ will be greater than zero.
      3. The eigenvectors of $A$ form the columns of matrix $P$.
    2. If $A$ is symmetric of rank $r$ and $P$ is orthogonal such that $P'AP=\Lambda$, then
      1. tr $(A)=\sum_{i=1}^{r} \lambda_{i}$ since tr$(A)=$tr$(P'AP)=$tr$(\Lambda)$.
      2. tr $(A^s)=\sum_{i=1}^{r} \lambda_{i}^{s}$.
    3. For every quadratic form $Q={\bf x}'A{\bf x}$ there exists an orthogonal transformation ${\bf x}=P{\bf y}$ which reduces $Q$ to a diagonal quadratic form so that

      \begin{displaymath}Q=\lambda_1y_1^2 + \lambda_2 y_2^2 + \ldots + \lambda_r y_r^2 \end{displaymath}

      where $r$ is the rank of $A$.
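
    The diagonal reduction and the resulting diagonal quadratic form can be verified numerically; a minimal NumPy sketch with an arbitrary symmetric $A$:

      # Diagonalize symmetric A as P'AP = diag(lambda_i) and verify that the
      # substitution x = Py reduces x'Ax to sum of lambda_i * y_i^2.
      import numpy as np

      rng = np.random.default_rng(4)          # arbitrary seed
      M = rng.standard_normal((3, 3))
      A = M + M.T                             # an arbitrary symmetric matrix

      lam, P = np.linalg.eigh(A)              # P orthogonal, columns = eigenvectors
      assert np.allclose(P.T @ A @ P, np.diag(lam))

      y = rng.standard_normal(3)
      x = P @ y
      assert np.isclose(x @ A @ x, np.sum(lam * y**2))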
  9. Idempotent Matrices.

    A matrix $A$ is said to be idempotent if $A^2=A$. In what follows, idempotent matrices are taken to be symmetric. Some properties are:

    1. If $A$ is idempotent and non-singular then $A=I$. To prove this, note that $AA=A$ and pre-multiply both sides by $A^{-1}$.
    2. The eigenvalues of an idempotent matrix are either $1$ or $0$.
    3. If $A$ is idempotent of rank $r$, there exists an orthogonal matrix $P$ such that $P'AP=E_r$ where $E_r$ is a diagonal matrix with the first $r$ leading diagonal elements $1$ and the remainder $0$.
    4. If $A$ is idempotent of rank $r$ then tr$(A)=r$. To prove this, note that there is an orthogonal matrix $P$ such that $P'AP=E_r$. Now tr$(P'AP)=$tr$(A)=$tr$(E_r)=r$.
    5. If the $i$th diagonal element of $A$ is zero, all elements in the $i$th row and column are zero.
    6. All idempotent matrices not of full rank are positive semidefinite. No idempotent matrix can have negative elements on its diagonal.
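
    A familiar statistical example of a symmetric idempotent matrix is the hat matrix $H=X(X'X)^{-1}X'$ from least squares. The sketch below assumes NumPy and an arbitrary random design matrix $X$, and checks idempotency, the $0/1$ eigenvalues, and tr$(H)=$r$(H)$:

      # The hat matrix H = X(X'X)^{-1}X' is symmetric idempotent.
      import numpy as np

      rng = np.random.default_rng(5)          # arbitrary seed
      X = rng.standard_normal((6, 2))         # full-column-rank design matrix
      H = X @ np.linalg.inv(X.T @ X) @ X.T

      assert np.allclose(H @ H, H)            # idempotent: H^2 = H
      vals = np.linalg.eigvalsh(H)
      assert np.all(np.isclose(vals, 0) | np.isclose(vals, 1))
      assert np.isclose(np.trace(H), np.linalg.matrix_rank(H))  # tr(H) = r = 2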

