
# Assumed knowledge of matrices and vector spaces

1. Use of terms singular, diagonal, unit, null, symmetric.
2. Operations of addition, subtraction, multiplication, inverse and transpose.

[We will use $A'$ for the transpose of $A$.]

1. $(A')' = A$,
2. $(AB)' = B'A'$.
3. The trace of a square matrix $A$, written $\mathrm{tr}(A)$, is defined as the sum of the diagonal elements of $A$. That is, $\mathrm{tr}(A) = \sum_{i} a_{ii}$.

1. $\mathrm{tr}(A + B) = \mathrm{tr}(A) + \mathrm{tr}(B)$,
2. $\mathrm{tr}(AB) = \mathrm{tr}(BA)$.
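A quick numerical check of these trace identities, using NumPy; the example matrices are chosen here for illustration, not taken from the notes.

```python
import numpy as np

# Example matrices (arbitrary values, for illustration only).
A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[0.0, 1.0], [5.0, 2.0]])

# tr(A + B) = tr(A) + tr(B)
assert np.isclose(np.trace(A + B), np.trace(A) + np.trace(B))

# tr(AB) = tr(BA), even though AB != BA in general
assert np.isclose(np.trace(A @ B), np.trace(B @ A))
```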
4. Linear Independence and Rank
1. Let $x_1, \ldots, x_n$ be a set of vectors and $c_1, \ldots, c_n$ be scalar constants. If $\sum_{i=1}^{n} c_i x_i = 0$ only if $c_1 = c_2 = \cdots = c_n = 0$, then the set of vectors is linearly independent.
2. The rank of a set of vectors is the maximum number of linearly independent vectors in the set.
3. For a square matrix $A$, the rank of $A$, denoted by $r(A)$, is the maximum order of the non-zero subdeterminants of $A$.
4. $r(A) = r(A') = r(A'A) = r(AA')$.
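These rank relations can be verified numerically; the following sketch uses an example matrix of rank 2 (values chosen here, not from the notes).

```python
import numpy as np

# Example matrix: row 2 is twice row 1, so the rank is less than 3.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [0.0, 1.0, 1.0]])

r = np.linalg.matrix_rank
assert r(A) == 2

# r(A) = r(A') = r(A'A) = r(AA')
assert r(A) == r(A.T) == r(A.T @ A) == r(A @ A.T)
```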
5. Quadratic Forms

For a $p$-vector $x$, where $x' = (x_1, x_2, \ldots, x_p)$, and a $p \times p$ square matrix $A = (a_{ij})$,

$Q = x'Ax = \sum_{i=1}^{p} \sum_{j=1}^{p} a_{ij} x_i x_j$

is a quadratic form in $x$.

The matrix $A$ and the quadratic form are called:

1. positive semidefinite if $x'Ax \ge 0$ for all $x$ and $x'Ax = 0$ for some $x \ne 0$.
2. positive definite if $x'Ax > 0$ for all $x \ne 0$.
1. A necessary and sufficient condition for $A$ to be positive definite is that each leading diagonal sub-determinant of $A$ is greater than $0$. So a positive definite matrix is non-singular.
2. A necessary and sufficient condition for a symmetric matrix $A$ to be positive definite is that there exists a non-singular matrix $P$ such that $A = PP'$.
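Both positive-definiteness criteria can be sketched numerically. The example below uses an arbitrary symmetric matrix; the Cholesky factor serves as one choice of the non-singular matrix $P$ with $A = PP'$.

```python
import numpy as np

# Example symmetric positive definite matrix (illustrative values).
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])

# Criterion (i): every leading diagonal sub-determinant is > 0.
minors = [np.linalg.det(A[:k, :k]) for k in range(1, A.shape[0] + 1)]
assert all(m > 0 for m in minors)

# Criterion (ii): A = PP' for a non-singular P; the (lower-triangular)
# Cholesky factor is one such P.
P = np.linalg.cholesky(A)
assert np.allclose(A, P @ P.T)
assert not np.isclose(np.linalg.det(P), 0.0)   # P is non-singular
```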
6. Orthogonality.

A square matrix $P$ is said to be orthogonal if $P'P = I$ (or $PP' = I$).

1. An orthogonal matrix is non-singular.
2. The determinant of an orthogonal matrix is $\pm 1$.
3. The transpose of an orthogonal matrix is also orthogonal.
4. The product of two orthogonal matrices is orthogonal.
5. If $P$ is orthogonal, $\mathrm{tr}(P'AP) = \mathrm{tr}(APP') = \mathrm{tr}(A)$.
6. If $P$ is orthogonal, $r(P'AP) = r(A)$.
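A rotation matrix is a standard concrete example of an orthogonal matrix; the sketch below checks several of the properties listed above (the angle and the matrix $A$ are chosen here for illustration).

```python
import numpy as np

# A 2x2 rotation matrix is orthogonal.
theta = 0.7
P = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

assert np.allclose(P.T @ P, np.eye(2))               # P'P = I
assert np.isclose(abs(np.linalg.det(P)), 1.0)        # determinant is +-1
assert np.allclose(P.T @ P.T.T, np.eye(2))           # transpose is orthogonal
Q = P @ P                                            # product of two orthogonal
assert np.allclose(Q.T @ Q, np.eye(2))               # matrices is orthogonal

A = np.array([[2.0, 1.0], [1.0, 5.0]])               # arbitrary example A
assert np.isclose(np.trace(P.T @ A @ P), np.trace(A))  # tr(P'AP) = tr(A)
```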
7. Eigenvalues and eigenvectors.

Eigenvalues $\lambda$ of a square matrix $A$ are defined as the roots of the equation $|A - \lambda I| = 0$. The corresponding vectors $x$ satisfying $Ax = \lambda x$ are the eigenvectors.

1. For a symmetric matrix, the eigenvectors corresponding to two different eigenvalues are orthogonal.
2. The number of non-zero eigenvalues of a symmetric matrix $A$ is equal to the rank of $A$.
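The defining equation and these two properties can be illustrated with `numpy.linalg.eigh` on an example symmetric matrix (values chosen here, not from the notes).

```python
import numpy as np

# Example symmetric matrix; its eigenvalues are 1 and 3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lams, vecs = np.linalg.eigh(A)   # columns of vecs are the eigenvectors

# Each pair satisfies A x = lambda x.
for lam, x in zip(lams, vecs.T):
    assert np.allclose(A @ x, lam * x)

# Eigenvectors for the two distinct eigenvalues are orthogonal.
assert np.isclose(vecs[:, 0] @ vecs[:, 1], 0.0)

# The number of non-zero eigenvalues equals the rank.
nonzero = np.count_nonzero(~np.isclose(lams, 0.0))
assert nonzero == np.linalg.matrix_rank(A)
```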
8. Reduction to diagonal form
1. Given any symmetric matrix $A$ there exists an orthogonal matrix $P$ such that $P'AP = \Lambda$, where $\Lambda$ is a diagonal matrix whose elements $\lambda_1, \ldots, \lambda_p$ are the eigenvalues of $A$. We write $\Lambda = \mathrm{diag}(\lambda_i)$.
1. If $A$ is not of full rank, some of the $\lambda_i$ will be zero.
2. If $A$ is positive definite (and therefore non-singular), all the $\lambda_i$ will be greater than zero.
3. The eigenvectors of $A$ form the columns of the matrix $P$.
2. If $A$ is symmetric of rank $r$ and $P$ is orthogonal such that $P'AP = \Lambda$, then
1. $\mathrm{tr}(A) = \sum_{i} \lambda_i$ since $\mathrm{tr}(A) = \mathrm{tr}(P'AP) = \mathrm{tr}(\Lambda)$.
2. $\mathrm{tr}(A^2) = \sum_{i} \lambda_i^2$.
3. For every quadratic form $Q = x'Ax$ there exists an orthogonal transformation $x = Py$ which reduces $Q$ to a diagonal quadratic form

$Q = \sum_{i=1}^{r} \lambda_i y_i^2$

where $r$ is the rank of $A$.
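The diagonal reduction can be sketched numerically: taking $P$ as the matrix of eigenvectors of a symmetric $A$ and substituting $x = Py$ turns the quadratic form into a weighted sum of squares. The matrix and the point $y$ below are chosen for illustration.

```python
import numpy as np

# Example symmetric matrix (eigenvalues 1 and 3).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lams, P = np.linalg.eigh(A)       # columns of P are eigenvectors of A

# P'AP = Lambda, the diagonal matrix of eigenvalues.
assert np.allclose(P.T @ A @ P, np.diag(lams))

# tr(A) = sum of the eigenvalues.
assert np.isclose(np.trace(A), lams.sum())

# Substituting x = Py reduces Q = x'Ax to sum(lambda_i * y_i^2).
y = np.array([1.0, -2.0])         # arbitrary point in the new coordinates
x = P @ y
assert np.isclose(x @ A @ x, np.sum(lams * y**2))
```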
9. Idempotent Matrices.

A matrix $A$ is said to be idempotent if $A^2 = A$. In the following we shall mean symmetric idempotent matrices. Some properties are:

1. If $A$ is idempotent and non-singular then $A = I$. To prove this, note that $AA = A$ and pre-multiply both sides by $A^{-1}$.
2. The eigenvalues of an idempotent matrix are either $0$ or $1$.
3. If $A$ is idempotent of rank $r$, there exists an orthogonal matrix $P$ such that $P'AP = E_r$, where $E_r$ is a diagonal matrix with the first $r$ leading diagonal elements $1$ and the remainder $0$.
4. If $A$ is idempotent of rank $r$ then $\mathrm{tr}(A) = r$. To prove this, note that there is an orthogonal matrix $P$ such that $P'AP = E_r$. Now $\mathrm{tr}(A) = \mathrm{tr}(P'AP) = \mathrm{tr}(E_r) = r$.
5. If the $i$th diagonal element of $A$ is zero, all elements in the $i$th row and column of $A$ are zero.
6. All idempotent matrices not of full rank are positive semi-definite. No idempotent matrix can have negative elements on its diagonal.
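A standard concrete example of a symmetric idempotent matrix is the least-squares projection ("hat") matrix $H = X(X'X)^{-1}X'$; the design matrix $X$ below is chosen here for illustration.

```python
import numpy as np

# Example design matrix (intercept plus one covariate), rank 2.
X = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
H = X @ np.linalg.inv(X.T @ X) @ X.T

# H is symmetric idempotent: H^2 = H.
assert np.allclose(H @ H, H)

# Its eigenvalues are all 0 or 1.
lams = np.linalg.eigvalsh(H)
assert np.allclose(lams, [0.0, 0.0, 1.0, 1.0])

# tr(H) equals the rank of H (here, the rank of X).
assert np.isclose(np.trace(H), np.linalg.matrix_rank(H))
```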

Bob Murison 2000-10-31