
Solving $A\mathbf{x}=\mathbf{b}$: row reduced form $R$

Lecture summary

When does $A\mathbf{x}=\mathbf{b}$ have solutions $\mathbf{x}$, and how can we describe those solutions?

Solvability conditions on $\mathbf{b}$

We again use the example:

A=[1222246836810].A=\left[{\begin{array}{r r r r}{1}&{2}&{2}&{2}\\ {2}&{4}&{6}&{8}\\ {3}&{6}&{8}&{10}\end{array}}\right].

The third row of $A$ is the sum of its first and second rows, so we know that if $A\mathbf{x}=\mathbf{b}$ has a solution, the third component of $\mathbf{b}$ equals the sum of its first and second components. If $\mathbf{b}$ does not satisfy $b_{3}=b_{1}+b_{2}$, the system has no solution. In general, if a combination of the rows of $A$ gives the zero row, then the same combination of the entries of $\mathbf{b}$ must equal zero.

One way to find out whether $A\mathbf{x}=\mathbf{b}$ is solvable is to use elimination on the augmented matrix. If a row of $A$ is completely eliminated, so is the corresponding entry in $\mathbf{b}$. In our example, row 3 of $A$ is completely eliminated:

$$\left[\begin{array}{rrrr|r}1&2&2&2&b_{1}\\ 2&4&6&8&b_{2}\\ 3&6&8&10&b_{3}\end{array}\right]\;\rightarrow\;\cdots\;\rightarrow\;\left[\begin{array}{rrrr|r}1&2&2&2&b_{1}\\ 0&0&2&4&b_{2}-2b_{1}\\ 0&0&0&0&b_{3}-b_{2}-b_{1}\end{array}\right].$$

If $A\mathbf{x}=\mathbf{b}$ has a solution, then $b_{3}-b_{2}-b_{1}=0$. For example, we could choose $\mathbf{b}=\left[\begin{array}{r}1\\ 5\\ 6\end{array}\right]$.

From an earlier lecture, we know that $A\mathbf{x}=\mathbf{b}$ is solvable exactly when $\mathbf{b}$ is in the column space $C(A)$. We have these two conditions on $\mathbf{b}$; in fact they are equivalent.
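As a quick numerical check (a sketch using NumPy, not part of the lecture; the helper name `is_solvable` is ours), one way to test whether $\mathbf{b}$ lies in $C(A)$ is to compare the rank of $A$ with the rank of the augmented matrix $[A\ \mathbf{b}]$:

```python
import numpy as np

A = np.array([[1, 2, 2, 2],
              [2, 4, 6, 8],
              [3, 6, 8, 10]], dtype=float)

def is_solvable(A, b):
    """Ax = b is solvable exactly when rank([A | b]) == rank(A),
    i.e. when appending b does not enlarge the column space."""
    augmented = np.column_stack([A, b])
    return np.linalg.matrix_rank(augmented) == np.linalg.matrix_rank(A)

print(is_solvable(A, np.array([1, 5, 6])))   # True:  6 = 1 + 5
print(is_solvable(A, np.array([1, 5, 7])))   # False: 7 != 1 + 5
```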

Complete solution

In order to find all solutions to $A\mathbf{x}=\mathbf{b}$ we first check that the equation is solvable, then find a particular solution. We get the complete solution of the equation by adding the particular solution to all the vectors in the nullspace.

A particular solution

One way to find a particular solution to the equation $A\mathbf{x}=\mathbf{b}$ is to set all free variables to zero, then solve for the pivot variables.

For our example matrix $A$, we let $x_{2}=x_{4}=0$ to get the system of equations:

$$\begin{array}{rcl}x_{1}+2x_{3}&=&1\\ 2x_{3}&=&3\end{array}$$

which has the solution $x_{3}=3/2$, $x_{1}=-2$. Our particular solution is:

$$\mathbf{x}_{p}=\left[\begin{array}{r}-2\\ 0\\ 3/2\\ 0\end{array}\right].$$
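The same recipe (set the free variables to zero, then solve the remaining triangular system for the pivot variables) can be sketched in code. This is only an illustration with NumPy; the eliminated rows and the pivot/free column indices for this particular $A$ and $\mathbf{b}=(1,5,6)$ are written in by hand.

```python
import numpy as np

# Nonzero rows of the eliminated augmented matrix [A | b] with b = (1, 5, 6):
#   x1 + 2*x2 + 2*x3 + 2*x4 = 1
#              2*x3 + 4*x4 = 3
U = np.array([[1, 2, 2, 2],
              [0, 0, 2, 4]], dtype=float)
rhs = np.array([1, 3], dtype=float)

pivot_cols = [0, 2]   # columns containing pivots
free_cols = [1, 3]    # free columns

# Set free variables to zero and solve the 2x2 pivot system.
x_particular = np.zeros(4)
x_particular[pivot_cols] = np.linalg.solve(U[:, pivot_cols], rhs)

print(x_particular)   # [-2.   0.   1.5  0. ]
```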

Combined with the nullspace

The general solution to $A\mathbf{x}=\mathbf{b}$ is given by $\mathbf{x}_{\mathrm{complete}}=\mathbf{x}_{p}+\mathbf{x}_{n}$, where $\mathbf{x}_{n}$ is a generic vector in the nullspace. To see this, we add $A\mathbf{x}_{p}=\mathbf{b}$ to $A\mathbf{x}_{n}=\mathbf{0}$ and get $A\left(\mathbf{x}_{p}+\mathbf{x}_{n}\right)=\mathbf{b}$ for every vector $\mathbf{x}_{n}$ in the nullspace.

Last lecture we learned that the nullspace of $A$ is the collection of all combinations of the special solutions $\left[\begin{array}{r}-2\\ 1\\ 0\\ 0\end{array}\right]$ and $\left[\begin{array}{r}2\\ 0\\ -2\\ 1\end{array}\right]$. So the complete solution to the equation $A\mathbf{x}=\left[\begin{array}{r}1\\ 5\\ 6\end{array}\right]$ is:

$$\mathbf{x}_{\mathrm{complete}}=\left[\begin{array}{r}-2\\ 0\\ 3/2\\ 0\end{array}\right]+c_{1}\left[\begin{array}{r}-2\\ 1\\ 0\\ 0\end{array}\right]+c_{2}\left[\begin{array}{r}2\\ 0\\ -2\\ 1\end{array}\right],$$

where $c_{1}$ and $c_{2}$ are real numbers.

The nullspace of $A$ is a two-dimensional subspace of $\mathbb{R}^{4}$, and the solutions to the equation $A\mathbf{x}=\mathbf{b}$ form a plane parallel to that subspace passing through $\mathbf{x}_{p}=\left[\begin{array}{r}-2\\ 0\\ 3/2\\ 0\end{array}\right]$.
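A quick sanity check (a NumPy sketch of our own, not part of the lecture): for any choice of $c_1$ and $c_2$, the vector $\mathbf{x}_p + c_1\mathbf{s}_1 + c_2\mathbf{s}_2$ should satisfy $A\mathbf{x}=\mathbf{b}$.

```python
import numpy as np

A = np.array([[1, 2, 2, 2],
              [2, 4, 6, 8],
              [3, 6, 8, 10]], dtype=float)
b = np.array([1, 5, 6], dtype=float)

x_p = np.array([-2, 0, 1.5, 0])          # particular solution
s1 = np.array([-2, 1, 0, 0])             # special solutions spanning N(A)
s2 = np.array([2, 0, -2, 1])

rng = np.random.default_rng(0)
for _ in range(5):
    c1, c2 = rng.standard_normal(2)      # arbitrary real coefficients
    x = x_p + c1 * s1 + c2 * s2
    assert np.allclose(A @ x, b)         # every such x solves Ax = b
print("all checks passed")
```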

Rank

The rank of a matrix equals the number of pivots of that matrix. If $A$ is an $m$ by $n$ matrix of rank $r$, we know $r\leq m$ and $r\leq n$.

Full column rank

If $r=n$, then from the previous lecture we know that the nullspace has dimension $n-r=0$ and contains only the zero vector. There are no free variables or special solutions.

If $A\mathbf{x}=\mathbf{b}$ has a solution, it is unique; there is either 0 or 1 solution. Examples like this, in which the columns are independent, are common in applications.

We know $r\leq m$, so if $r=n$ the number of columns of the matrix is less than or equal to the number of rows. The row reduced echelon form of the matrix will look like $R=\left[\begin{array}{c}I\\ 0\end{array}\right]$. For any vector $\mathbf{b}$ in $\mathbb{R}^{m}$ that is not a linear combination of the columns of $A$, there is no solution to $A\mathbf{x}=\mathbf{b}$.

Full row rank

If $r=m$, then the reduced matrix $R=\left[\begin{array}{cc}I&F\end{array}\right]$ has no rows of zeros and so there are no requirements for the entries of $\mathbf{b}$ to satisfy. The equation $A\mathbf{x}=\mathbf{b}$ is solvable for every $\mathbf{b}$. There are $n-r=n-m$ free variables, so there are $n-m$ special solutions to $A\mathbf{x}=\mathbf{0}$.

Full row and column rank

If $r=m=n$ is the number of pivots of $A$, then $A$ is an invertible square matrix and $R$ is the identity matrix. The nullspace has dimension zero, and $A\mathbf{x}=\mathbf{b}$ has a unique solution for every $\mathbf{b}$ in $\mathbb{R}^{m}$.

Summary

If $R$ is in row reduced form with pivot columns first (rref), the table below summarizes our results.

|  | $r=m=n$ | $r=n<m$ | $r=m<n$ | $r<m,\ r<n$ |
|---|---|---|---|---|
| $R$ | $I$ | $\left[\begin{array}{c}I\\ 0\end{array}\right]$ | $\left[\begin{array}{cc}I&F\end{array}\right]$ | $\left[\begin{array}{cc}I&F\\ 0&0\end{array}\right]$ |
| # solutions to $A\mathbf{x}=\mathbf{b}$ | 1 | 0 or 1 | infinitely many | 0 or infinitely many |
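The table can also be read off numerically. Below is a sketch (the example matrices for each case are our own, chosen only for illustration) that classifies existence and uniqueness by comparing $r$ with $m$ and $n$:

```python
import numpy as np

def describe(A):
    """Classify solutions of Ax = b from the shape and rank of A."""
    m, n = A.shape
    r = np.linalg.matrix_rank(A)
    exists_for_every_b = (r == m)    # full row rank: no zero rows in R
    unique_when_exists = (r == n)    # full column rank: trivial nullspace
    return m, n, r, exists_for_every_b, unique_when_exists

examples = {
    "r = m = n":    np.array([[1.0, 2.0], [3.0, 4.0]]),
    "r = n < m":    np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]),
    "r = m < n":    np.array([[1.0, 2.0, 3.0], [0.0, 1.0, 4.0]]),
    "r < m, r < n": np.array([[1.0, 2.0], [2.0, 4.0]]),
}
for name, A in examples.items():
    print(name, describe(A))
```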

Problems and Solutions

Problem 8.1: (3.4 #13.(a,b,d) Introduction to Linear Algebra: Strang) Explain why these are all false:

a) The complete solution is any linear combination of $\mathbf{x}_{p}$ and $\mathbf{x}_{n}$.
b) The system $A\mathbf{x}=\mathbf{b}$ has at most one particular solution.
c) If $A$ is invertible there is no solution $\mathbf{x}_{n}$ in the nullspace.

Solution

a) The coefficient of $\mathbf{x}_{p}$ must be one.
b) If $\mathbf{x}_{n}\in N(A)$ is in the nullspace of $A$ and $\mathbf{x}_{p}$ is one particular solution, then $\mathbf{x}_{p}+\mathbf{x}_{n}$ is also a particular solution.
c) There is always $\mathbf{x}_{n}=\mathbf{0}$ in the nullspace.

Problem 8.2: (3.4 #28.) Let

$$U=\left[\begin{array}{rrr}1&2&3\\ 0&0&4\end{array}\right]\quad\text{and}\quad\mathbf{c}=\left[\begin{array}{r}5\\ 8\end{array}\right].$$

Use Gauss-Jordan elimination to reduce the matrices $[U\ \mathbf{0}]$ and $[U\ \mathbf{c}]$ to $[R\ \mathbf{0}]$ and $[R\ \mathbf{d}]$. Solve $R\mathbf{x}=\mathbf{0}$ and $R\mathbf{x}=\mathbf{d}$.

Check your work by plugging your values into the equations $U\mathbf{x}=\mathbf{0}$ and $U\mathbf{x}=\mathbf{c}$.

Solution

First we transform $[U\ \mathbf{0}]$ into $[R\ \mathbf{0}]$:

$$[U\ \mathbf{0}]=\left[\begin{array}{rrrr}1&2&3&0\\ 0&0&4&0\end{array}\right]\longrightarrow\left[\begin{array}{rrrr}1&2&3&0\\ 0&0&1&0\end{array}\right]\longrightarrow\left[\begin{array}{rrrr}1&2&0&0\\ 0&0&1&0\end{array}\right]=[R\ \mathbf{0}].$$

We now solve $R\mathbf{x}=\mathbf{0}$ via back substitution:

$$\left[\begin{array}{rrr}1&2&0\\ 0&0&1\end{array}\right]\left[\begin{array}{r}x_{1}\\ x_{2}\\ x_{3}\end{array}\right]=\left[\begin{array}{r}0\\ 0\end{array}\right]\longrightarrow\begin{array}{r}x_{1}+2x_{2}=0\\ x_{3}=0\end{array}\longrightarrow\mathbf{x}=\left[\begin{array}{r}2\\ -1\\ 0\end{array}\right],$$

where we used the free variable $x_{2}=-1$. ($c\mathbf{x}$ is a solution for all $c$.) We check that this is a correct solution by plugging it into $U\mathbf{x}=\mathbf{0}$:

$$\left[\begin{array}{rrr}1&2&3\\ 0&0&4\end{array}\right]\left[\begin{array}{r}2\\ -1\\ 0\end{array}\right]=\left[\begin{array}{r}0\\ 0\end{array}\right].\ \checkmark$$

Next, we transform $[U\ \mathbf{c}]$ into $[R\ \mathbf{d}]$:

$$[U\ \mathbf{c}]=\left[\begin{array}{rrrr}1&2&3&5\\ 0&0&4&8\end{array}\right]\longrightarrow\left[\begin{array}{rrrr}1&2&3&5\\ 0&0&1&2\end{array}\right]\longrightarrow\left[\begin{array}{rrrr}1&2&0&-1\\ 0&0&1&2\end{array}\right]=[R\ \mathbf{d}].$$

We now solve $R\mathbf{x}=\mathbf{d}$ via back substitution:

$$\left[\begin{array}{rrr}1&2&0\\ 0&0&1\end{array}\right]\left[\begin{array}{r}x_{1}\\ x_{2}\\ x_{3}\end{array}\right]=\left[\begin{array}{r}-1\\ 2\end{array}\right]\longrightarrow\begin{array}{r}x_{1}+2x_{2}=-1\\ x_{3}=2\end{array}\longrightarrow\mathbf{x}=\left[\begin{array}{r}-3\\ 1\\ 2\end{array}\right],$$

where we used the free variable $x_{2}=1$.

Finally, we check that this is the correct solution by plugging it into the equation $U\mathbf{x}=\mathbf{c}$:

$$\left[\begin{array}{rrr}1&2&3\\ 0&0&4\end{array}\right]\left[\begin{array}{r}-3\\ 1\\ 2\end{array}\right]=\left[\begin{array}{r}5\\ 8\end{array}\right].\ \checkmark$$
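The elimination above can also be reproduced mechanically. This is a sketch using SymPy's `rref` (not part of the original solution) to confirm $[R\ \mathbf{d}]$ and the two solutions:

```python
from sympy import Matrix

U = Matrix([[1, 2, 3], [0, 0, 4]])
c = Matrix([5, 8])

# Row reduce the augmented matrix [U c]; rref() also returns the pivot columns.
R_d, pivots = Matrix.hstack(U, c).rref()
print(R_d)     # Matrix([[1, 2, 0, -1], [0, 0, 1, 2]])
print(pivots)  # (0, 2)

# Check the solutions found above against U itself.
print(U * Matrix([2, -1, 0]))   # Matrix([[0], [0]])  solves Ux = 0
print(U * Matrix([-3, 1, 2]))   # Matrix([[5], [8]])  solves Ux = c
```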

Problem 8.3: (3.4 #36.) Suppose $A\mathbf{x}=\mathbf{b}$ and $C\mathbf{x}=\mathbf{b}$ have the same (complete) solutions for every $\mathbf{b}$. Is it true that $A=C$?

Solution

Yes. In order to check that $A=C$ as matrices, it is enough to check that $A\mathbf{y}=C\mathbf{y}$ for all vectors $\mathbf{y}$ of the correct size (or just for the standard basis vectors, since multiplication by them “picks out the columns”). So let $\mathbf{y}$ be any vector of the correct size, and set $\mathbf{b}=A\mathbf{y}$. Then $\mathbf{y}$ is certainly a solution to $A\mathbf{x}=\mathbf{b}$, and so by our hypothesis it must also be a solution to $C\mathbf{x}=\mathbf{b}$; in other words, $C\mathbf{y}=\mathbf{b}=A\mathbf{y}$.
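The parenthetical remark about standard basis vectors can be seen concretely in this small NumPy sketch of our own: multiplying a matrix by $\mathbf{e}_j$ returns its $j$-th column, so two matrices that agree on every $\mathbf{e}_j$ agree column by column.

```python
import numpy as np

A = np.array([[1, 2, 2, 2],
              [2, 4, 6, 8],
              [3, 6, 8, 10]], dtype=float)

for j in range(A.shape[1]):
    e_j = np.zeros(A.shape[1])
    e_j[j] = 1                                # j-th standard basis vector
    assert np.array_equal(A @ e_j, A[:, j])   # A e_j is the j-th column of A
print("A e_j picks out column j for every j")
```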