Proof Since $A(A^{-1}\mathbf{b}) = \mathbf{b}$, it follows that $\mathbf{x} = A^{-1}\mathbf{b}$ is a solution of $A\mathbf{x} = \mathbf{b}$. To show that this is the only solution, we will assume
that $\mathbf{x}_0$ is an arbitrary solution and then show that $\mathbf{x}_0$ must be the solution $A^{-1}\mathbf{b}$.
If $\mathbf{x}_0$ is any solution, then $A\mathbf{x}_0 = \mathbf{b}$. Multiplying both sides by $A^{-1}$, we obtain $\mathbf{x}_0 = A^{-1}\mathbf{b}$.
EXAMPLE 1 Solution of a Linear System Using $A^{-1}$
Consider the system of linear equations
$$\begin{aligned}
x_1 + 2x_2 + 3x_3 &= 5\\
2x_1 + 5x_2 + 3x_3 &= 3\\
x_1 \phantom{{}+2x_2} + 8x_3 &= 17
\end{aligned}$$
In matrix form this system can be written as $A\mathbf{x} = \mathbf{b}$, where
$$A = \begin{bmatrix} 1 & 2 & 3\\ 2 & 5 & 3\\ 1 & 0 & 8 \end{bmatrix}, \qquad
\mathbf{x} = \begin{bmatrix} x_1\\ x_2\\ x_3 \end{bmatrix}, \qquad
\mathbf{b} = \begin{bmatrix} 5\\ 3\\ 17 \end{bmatrix}$$
In Example 4 of the preceding section, we showed that $A$ is invertible and
$$A^{-1} = \begin{bmatrix} -40 & 16 & 9\\ 13 & -5 & -3\\ 5 & -2 & -1 \end{bmatrix}$$
By Theorem 1.6.2, the solution of the system is
$$\mathbf{x} = A^{-1}\mathbf{b} = \begin{bmatrix} -40 & 16 & 9\\ 13 & -5 & -3\\ 5 & -2 & -1 \end{bmatrix}\begin{bmatrix} 5\\ 3\\ 17 \end{bmatrix} = \begin{bmatrix} 1\\ -1\\ 2 \end{bmatrix}$$
or $x_1 = 1$, $x_2 = -1$, $x_3 = 2$.
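As a check on the arithmetic, the computation $\mathbf{x} = A^{-1}\mathbf{b}$ can be reproduced numerically. The sketch below uses NumPy (a library choice of ours, not part of the text) to invert $A$ and multiply by $\mathbf{b}$.

```python
import numpy as np

# Coefficient matrix and right-hand side of Example 1
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 5.0, 3.0],
              [1.0, 0.0, 8.0]])
b = np.array([5.0, 3.0, 17.0])

A_inv = np.linalg.inv(A)   # A^{-1}, as found in Example 4 of the preceding section
x = A_inv @ b              # x = A^{-1} b, per Theorem 1.6.2

print(A_inv)               # approximately [[-40, 16, 9], [13, -5, -3], [5, -2, -1]]
print(x)                   # approximately [1, -1, 2], i.e. x1 = 1, x2 = -1, x3 = 2
```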
Remark
Note that the method of Example 1 applies only when the system has as many equations as unknowns and the coefficient matrix is
invertible. This method is less efficient, computationally, than Gaussian elimination, but it is important in the analysis of equations
involving matrices.
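To illustrate the remark, the sketch below contrasts the inverse method of Example 1 with an elimination-based solver. Here numpy.linalg.solve, which factors $A$ rather than inverting it, stands in for Gaussian elimination; the library and the particular comparison are our choices, not the text's.

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 5.0, 3.0],
              [1.0, 0.0, 8.0]])
b = np.array([5.0, 3.0, 17.0])

# Inverse method of Example 1: useful for analysis, but it computes all of
# A^{-1} even though only the single product A^{-1} b is needed.
x_via_inverse = np.linalg.inv(A) @ b

# Elimination-based solver (LU factorization): the usual computational choice.
x_via_elimination = np.linalg.solve(A, b)

print(np.allclose(x_via_inverse, x_via_elimination))   # True: both give (1, -1, 2)
```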
Linear Systems with a Common Coefficient Matrix
Frequently, one is concerned with solving a sequence of systems
$$A\mathbf{x} = \mathbf{b}_1, \quad A\mathbf{x} = \mathbf{b}_2, \quad A\mathbf{x} = \mathbf{b}_3, \quad \ldots, \quad A\mathbf{x} = \mathbf{b}_k$$
each of which has the same square coefficient matrix $A$. If $A$ is invertible, then the solutions
$$\mathbf{x}_1 = A^{-1}\mathbf{b}_1, \quad \mathbf{x}_2 = A^{-1}\mathbf{b}_2, \quad \mathbf{x}_3 = A^{-1}\mathbf{b}_3, \quad \ldots, \quad \mathbf{x}_k = A^{-1}\mathbf{b}_k$$
can be obtained with one matrix inversion and $k$ matrix multiplications. Once again, however, a more efficient method is to form the matrix
$$\left[\,A \mid \mathbf{b}_1 \mid \mathbf{b}_2 \mid \cdots \mid \mathbf{b}_k\,\right] \qquad (1)$$
in which the coefficient matrix $A$ is "augmented" by all $k$ of the matrices $\mathbf{b}_1, \mathbf{b}_2, \ldots, \mathbf{b}_k$, and then reduce (1) to reduced row-echelon
form by Gauss–Jordan elimination. In this way we can solve all $k$ systems at once. This method has the added advantage that it
applies even when $A$ is not invertible.
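The sketch below carries out this augmented-matrix procedure, using SymPy's rref routine for the Gauss–Jordan reduction. The coefficient matrix is the one from Example 1, and the two right-hand sides $\mathbf{b}_1$ and $\mathbf{b}_2$ are chosen by us purely for illustration.

```python
import sympy as sp

# Common coefficient matrix (from Example 1) and two illustrative right-hand sides
A  = sp.Matrix([[1, 2, 3],
                [2, 5, 3],
                [1, 0, 8]])
b1 = sp.Matrix([5, 3, 17])
b2 = sp.Matrix([1, 0, 0])

# Form [A | b1 | b2] and reduce it to reduced row-echelon form (Gauss-Jordan).
augmented = A.row_join(b1).row_join(b2)
rref, pivots = augmented.rref()

# Because A is invertible, the left block reduces to the identity, and the
# last two columns are the solutions of A x = b1 and A x = b2, respectively.
x1 = rref[:, 3]   # Matrix([1, -1, 2])
x2 = rref[:, 4]   # Matrix([-40, 13, 5]), the first column of A^{-1}
print(x1, x2)
```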

