Elementary Linear Algebra with Applications (Anton, 9th edition), page 501
THEOREM 6.4.3
If $A$ is an $m \times n$ matrix, then the following are equivalent.
(a) $A$ has linearly independent column vectors.
(b) $A^{T}A$ is invertible.
Proof We shall prove that $(a) \Rightarrow (b)$ and leave the proof that $(b) \Rightarrow (a)$ as an exercise.
Assume that $A$ has linearly independent column vectors. The matrix $A^{T}A$ has size $n \times n$, so we can prove that this
matrix is invertible by showing that the linear system $A^{T}A\mathbf{x} = \mathbf{0}$ has only the trivial solution. But if $\mathbf{x}$ is any solution of this
system, then $A\mathbf{x}$ is in the nullspace of $A^{T}$ and also in the column space of $A$. By Theorem 6.2.6 these spaces are orthogonal
complements, so part (b) of Theorem 6.2.5 implies that $A\mathbf{x} = \mathbf{0}$. But $A$ has linearly independent column vectors, so $\mathbf{x} = \mathbf{0}$ by
Theorem 5.6.8.
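As a quick computational illustration of Theorem 6.4.3 (an added sketch, not part of the text), the following Python/NumPy fragment checks two small, arbitrarily chosen matrices: one with independent columns, for which $A^{T}A$ is invertible, and one with dependent columns, for which it is singular.

```python
import numpy as np

# A has linearly independent columns, so A^T A should be invertible.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
print(np.linalg.matrix_rank(A))      # 2  (full column rank)
print(np.linalg.det(A.T @ A))        # nonzero, so A^T A is invertible

# B has dependent columns (second column = 2 * first), so B^T B is singular.
B = np.array([[ 1.0,  2.0],
              [ 3.0,  6.0],
              [-2.0, -4.0]])
print(np.linalg.matrix_rank(B))      # 1
print(np.linalg.det(B.T @ B))        # 0 (up to round-off)
```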
The next theorem is a direct consequence of Theorems 6.4.2 and 6.4.3. We omit the details.
THEOREM 6.4.4
If $A$ is an $m \times n$ matrix with linearly independent column vectors, then for every $m \times 1$ matrix $\mathbf{b}$, the linear system $A\mathbf{x} = \mathbf{b}$
has a unique least squares solution. This solution is given by
$$\mathbf{x} = (A^{T}A)^{-1}A^{T}\mathbf{b} \tag{4}$$
Moreover, if $W$ is the column space of $A$, then the orthogonal projection of $\mathbf{b}$ on $W$ is
$$\operatorname{proj}_{W}\mathbf{b} = A\mathbf{x} = A(A^{T}A)^{-1}A^{T}\mathbf{b} \tag{5}$$
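To make Formulas (4) and (5) concrete, here is a small Python/NumPy sketch (added here, using an arbitrary system that is not from the text) that evaluates them directly.

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])          # linearly independent columns
b = np.array([6.0, 0.0, 0.0])

x = np.linalg.inv(A.T @ A) @ A.T @ b   # Formula (4): x = (A^T A)^{-1} A^T b
proj_b = A @ x                         # Formula (5): proj_W b = A x

print(x)        # unique least squares solution: [ 5. -3.]
print(proj_b)   # orthogonal projection of b on col(A): [ 5.  2. -1.]
```

As a check, the residual $\mathbf{b} - A\mathbf{x} = (1, -2, 1)$ is orthogonal to both columns of $A$, as it must be for the orthogonal projection.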
Remark Formulas (4) and (5) have various theoretical applications, but they are very inefficient for numerical calculations.
Least squares solutions of $A\mathbf{x} = \mathbf{b}$ are typically found by using Gaussian elimination to solve the normal equations, and the
orthogonal projection of $\mathbf{b}$ on the column space of $A$, if needed, is best obtained by computing $A\mathbf{x}$, where $\mathbf{x}$ is the least
squares solution of $A\mathbf{x} = \mathbf{b}$. The $QR$-decomposition of $A$ is also used to find least squares solutions of $A\mathbf{x} = \mathbf{b}$.
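In line with this remark, the sketch below (an added illustration assuming NumPy; the matrices are the same arbitrary ones used above) avoids the explicit inverse: it solves the normal equations with a linear solver and also obtains the same solution from a QR factorization.

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

# Normal equations  A^T A x = A^T b, solved by Gaussian elimination (np.linalg.solve).
x_normal = np.linalg.solve(A.T @ A, A.T @ b)

# QR route: A = QR with R square upper triangular; then solve R x = Q^T b.
Q, R = np.linalg.qr(A)
x_qr = np.linalg.solve(R, Q.T @ b)

print(x_normal, x_qr)     # both give the same least squares solution [ 5. -3.]
print(A @ x_normal)       # projection of b on the column space of A: [ 5.  2. -1.]
```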
EXAMPLE 1 Least Squares Solution
Find the least squares solution of the linear system $A\mathbf{x} = \mathbf{b}$ given by
$$\begin{aligned} x_{1} - x_{2} &= 4 \\ 3x_{1} + 2x_{2} &= 1 \\ -2x_{1} + 4x_{2} &= 3 \end{aligned}$$
and find the orthogonal projection of $\mathbf{b}$ on the column space of $A$.
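As a numerical cross-check on Example 1 (added here; the system is as reconstructed above), NumPy's built-in least squares routine gives the same solution as the normal equations.

```python
import numpy as np

# Coefficient matrix and right-hand side of the system in Example 1 (as reconstructed above).
A = np.array([[ 1.0, -1.0],
              [ 3.0,  2.0],
              [-2.0,  4.0]])
b = np.array([4.0, 1.0, 3.0])

x, *_ = np.linalg.lstsq(A, b, rcond=None)
print(x)       # least squares solution, approximately [0.1789, 0.5018] = [17/95, 143/285]
print(A @ x)   # orthogonal projection of b on the column space of A
```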