Page 568 - Elementary Linear Algebra with Applications, Anton, 9th edition
(a) The eigenvalues of A are all real numbers.
(b) Eigenvectors from different eigenspaces are orthogonal.
Proof (a) The proof of part (a), which requires results about complex vector spaces, is discussed in Section 10.6.
Proof (b) Let v₁ and v₂ be eigenvectors corresponding to distinct eigenvalues λ₁ and λ₂ of the matrix A. We want to show
that v₁ · v₂ = 0. The proof involves the trick of starting with the expression Av₁ · v₂. It follows from Formula (8) of
Section 4.1 and the symmetry of A that
Av₁ · v₂ = v₁ · Aᵀv₂ = v₁ · Av₂ (3)
But v₁ is an eigenvector of A corresponding to λ₁, and v₂ is an eigenvector of A corresponding to λ₂,
so (3) yields the relationship
λ₁(v₁ · v₂) = λ₂(v₁ · v₂)
which can be rewritten as
(λ₁ − λ₂)(v₁ · v₂) = 0 (4)
But λ₁ − λ₂ ≠ 0, since λ₁ and λ₂ were assumed distinct. Thus it follows from (4) that v₁ · v₂ = 0.
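The orthogonality in part (b) can be checked numerically. The sketch below (not from the text) works a 2×2 symmetric matrix by hand: the example matrix [[2, 1], [1, 2]] and the eigenvector formula (b, λ − a) are illustrative choices, not anything the book prescribes.

```python
import math

# A hypothetical symmetric 2x2 matrix A = [[a, b], [b, c]]; any symmetric choice works.
a, b, c = 2.0, 1.0, 2.0

# Eigenvalues from the characteristic equation: lam^2 - (a+c)lam + (ac - b^2) = 0
disc = math.sqrt((a - c) ** 2 + 4 * b ** 2)
lam1 = (a + c + disc) / 2   # larger eigenvalue (here 3)
lam2 = (a + c - disc) / 2   # smaller eigenvalue (here 1)

# For each eigenvalue lam, the first row of (A - lam*I)v = 0 gives the
# eigenvector (b, lam - a), provided b != 0.
v1 = (b, lam1 - a)          # eigenvector for lam1: (1, 1)
v2 = (b, lam2 - a)          # eigenvector for lam2: (1, -1)

# Theorem 7.3.2(b): eigenvectors from distinct eigenvalues are orthogonal.
dot = v1[0] * v2[0] + v1[1] * v2[1]
print(lam1, lam2, dot)      # 3.0 1.0 0.0
```

Expanding the dot product symbolically gives b² + (λ₁ − a)(λ₂ − a) = b² + λ₁λ₂ − a(λ₁ + λ₂) + a² = b² + (ac − b²) − a(a + c) + a² = 0, so the orthogonality is exact, not a numerical accident.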
Remark We remind the reader that we have assumed to this point that all of our matrices have real entries. Indeed, we shall
see in Chapter 10 that part (a) of Theorem 7.3.2 is false for matrices with complex entries.
Diagonalization of Symmetric Matrices
As a consequence of the preceding theorem we obtain the following procedure for orthogonally diagonalizing a symmetric
matrix.
Step 1. Find a basis for each eigenspace of A.
Step 2. Apply the Gram–Schmidt process to each of these bases to obtain an orthonormal basis for each eigenspace.
Step 3. Form the matrix P whose columns are the basis vectors constructed in Step 2; this matrix orthogonally
diagonalizes A.
The justification of this procedure should be clear: Theorem 7.3.2 ensures that eigenvectors from different eigenspaces are
orthogonal, whereas the application of the Gram–Schmidt process ensures that the eigenvectors obtained within the same
eigenspace are orthonormal. Therefore, the entire set of eigenvectors obtained by this procedure is orthonormal.
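The three steps above can be sketched end to end for the same kind of small example. Everything here is an illustrative assumption, not the book's code: the matrix [[2, 1], [1, 2]] has one-dimensional eigenspaces spanned by (1, 1) and (1, −1), so Step 2's Gram–Schmidt pass reduces to normalizing each vector, and Step 3's matrix P then satisfies PᵀAP = D with the eigenvalues on the diagonal.

```python
import math

# Step 1: bases for the eigenspaces of the symmetric matrix A = [[2, 1], [1, 2]]
# (computed by hand from det(A - lam*I) = 0): (1, 1) for lam = 3, (1, -1) for lam = 1.
A = [[2.0, 1.0], [1.0, 2.0]]

# Step 2: each eigenspace is one-dimensional, so Gram-Schmidt just normalizes.
s = 1 / math.sqrt(2)

# Step 3: P has the orthonormal eigenvectors as its columns.
P = [[s, s], [s, -s]]

def matmul(X, Y):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

Pt = [[P[j][i] for j in range(2)] for i in range(2)]  # transpose of P

# P orthogonally diagonalizes A: P^T A P is the diagonal matrix diag(3, 1).
D = matmul(matmul(Pt, A), P)
print([[round(x, 10) for x in row] for row in D])  # [[3.0, 0.0], [0.0, 1.0]]
```

Note that because P has orthonormal columns, Pᵀ = P⁻¹, which is exactly what "orthogonally diagonalizes" means: no matrix inversion is ever needed.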

