
The preceding theorem guarantees that an $n \times n$ matrix A with n linearly independent eigenvectors is diagonalizable, and the proof provides the following method for diagonalizing A.

Step 1. Find n linearly independent eigenvectors of A, say $\mathbf{p}_1, \mathbf{p}_2, \ldots, \mathbf{p}_n$.

Step 2. Form the matrix P having $\mathbf{p}_1, \mathbf{p}_2, \ldots, \mathbf{p}_n$ as its column vectors.

Step 3. The matrix $P^{-1}AP$ will then be diagonal with $\lambda_1, \lambda_2, \ldots, \lambda_n$ as its successive diagonal entries, where $\lambda_i$ is the eigenvalue corresponding to $\mathbf{p}_i$ for $i = 1, 2, \ldots, n$.
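
As an informal illustration (not part of the text), the three steps can be mirrored numerically in Python with NumPy; the sketch below applies them to the matrix A of Example 1 below, letting `numpy.linalg.eig` supply the eigenvectors in Step 1.

```python
import numpy as np

# Sketch of the three-step procedure, applied to the matrix A of Example 1.
A = np.array([[0.0, 0.0, -2.0],
              [1.0, 2.0,  1.0],
              [1.0, 0.0,  3.0]])

# Step 1: np.linalg.eig returns the eigenvalues and a matrix whose columns
# are corresponding eigenvectors (linearly independent here, since A turns
# out to be diagonalizable).
eigenvalues, eigenvectors = np.linalg.eig(A)

# Step 2: use those eigenvectors as the column vectors of P.
P = eigenvectors

# Step 3: P^{-1} A P is diagonal, with the eigenvalues as its diagonal
# entries in the same order as the columns of P.
D = np.linalg.inv(P) @ A @ P
assert np.allclose(D, np.diag(eigenvalues))
print(np.round(D, 10))
```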

In order to carry out Step 1 of this procedure, one first needs a way of determining whether a given matrix A has n linearly
independent eigenvectors, and then one needs a method for finding them. One can address both problems at the same time by
finding bases for the eigenspaces of A. Later in this section, we will show that those basis vectors, as a combined set, are linearly
independent, so that if there is a total of n such vectors, then A is diagonalizable, and the n basis vectors can be used as the
column vectors of the diagonalizing matrix P. If there are fewer than n basis vectors, then A is not diagonalizable.
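
The following Python/SymPy sketch (an illustration added here, not from the text; the helper name `has_full_eigenvector_set` is made up) carries out exactly this count: it collects a basis for each eigenspace and checks whether the bases supply n vectors in total.

```python
import sympy as sp

def has_full_eigenvector_set(A: sp.Matrix) -> bool:
    # eigenvects() yields (eigenvalue, algebraic multiplicity, eigenspace basis);
    # A is diagonalizable exactly when the bases contribute n vectors in total.
    n = A.rows
    return sum(len(basis) for _, _, basis in A.eigenvects()) == n

# The 3 x 3 matrix of Example 1: three basis vectors, so diagonalizable.
A = sp.Matrix([[0, 0, -2], [1, 2, 1], [1, 0, 3]])
print(has_full_eigenvector_set(A))                   # True

# A matrix with too few eigenvectors (its only eigenvalue, 1, has a
# one-dimensional eigenspace), so it is not diagonalizable.
B = sp.Matrix([[1, 1], [0, 1]])
print(has_full_eigenvector_set(B))                   # False
print(A.is_diagonalizable(), B.is_diagonalizable())  # SymPy's built-in check agrees
```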

EXAMPLE 1 Finding a Matrix P That Diagonalizes a Matrix A
Find a matrix P that diagonalizes

$$A = \begin{bmatrix} 0 & 0 & -2 \\ 1 & 2 & 1 \\ 1 & 0 & 3 \end{bmatrix}$$

Solution

From Example 5 of the preceding section, we found the characteristic equation of A to be

$$(\lambda - 1)(\lambda - 2)^2 = 0$$

and we found the following bases for the eigenspaces:

$$\lambda = 2: \quad \mathbf{p}_1 = \begin{bmatrix} -1 \\ 0 \\ 1 \end{bmatrix}, \quad \mathbf{p}_2 = \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix}; \qquad \lambda = 1: \quad \mathbf{p}_3 = \begin{bmatrix} -2 \\ 1 \\ 1 \end{bmatrix}$$

There are three basis vectors in total, so the matrix A is diagonalizable and

$$P = \begin{bmatrix} -1 & 0 & -2 \\ 0 & 1 & 1 \\ 1 & 0 & 1 \end{bmatrix}$$

diagonalizes A. As a check, the reader should verify that

$$P^{-1}AP = \begin{bmatrix} -1 & 0 & -2 \\ 0 & 1 & 1 \\ 1 & 0 & 1 \end{bmatrix}^{-1} \begin{bmatrix} 0 & 0 & -2 \\ 1 & 2 & 1 \\ 1 & 0 & 3 \end{bmatrix} \begin{bmatrix} -1 & 0 & -2 \\ 0 & 1 & 1 \\ 1 & 0 & 1 \end{bmatrix} = \begin{bmatrix} 2 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 1 \end{bmatrix}$$
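
One way to perform this check (a sketch using SymPy's exact arithmetic, added here as an illustration rather than taken from the text):

```python
import sympy as sp

# The matrices of Example 1: A, and the diagonalizing matrix P whose columns
# are the eigenspace basis vectors p1, p2, p3.
A = sp.Matrix([[0, 0, -2], [1, 2, 1], [1, 0, 3]])
P = sp.Matrix([[-1, 0, -2], [0, 1, 1], [1, 0, 1]])

# P^{-1} A P comes out exactly diagonal, with the eigenvalues 2, 2, 1
# appearing in the same order as the corresponding columns of P.
D = P.inv() * A * P
assert D == sp.diag(2, 2, 1)
print(D)
```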

There is no preferred order for the columns of P. Since the ith diagonal entry of $P^{-1}AP$ is an eigenvalue for the ith column vector of P, changing the order of the columns of P just changes the order of the eigenvalues on the diagonal of $P^{-1}AP$. Thus,