
Lecture 5. Transposes, permutations, spaces $\mathbb{R}^n$

Lecture summary

In this lecture we introduce vector spaces and their subspaces.

Permutations

Multiplication by a permutation matrix $P$ swaps the rows of a matrix; when applying the method of elimination we use permutation matrices to move zeros out of pivot positions. Our factorization $A = LU$ then becomes $PA = LU$, where $P$ is a permutation matrix which reorders any number of rows of $A$. Recall that $P^{-1} = P^T$, i.e. that $P^TP = I$.
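
As a quick sanity check (a minimal NumPy sketch added to these notes, with matrices made up for illustration), we can build a small permutation matrix, watch it exchange rows, and confirm that $P^TP = I$:

```python
import numpy as np

# Permutation matrix that exchanges the first two rows
P = np.array([[0, 1, 0],
              [1, 0, 0],
              [0, 0, 1]])

# A has a zero in the first pivot position, so elimination needs a row exchange
A = np.array([[0, 2, 3],
              [1, 4, 5],
              [6, 7, 8]])

print(P @ A)     # the first two rows of A are exchanged, putting a nonzero entry in the pivot
print(P.T @ P)   # the identity matrix, confirming P^{-1} = P^T
```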

Transposes

When we take the transpose of a matrix, its rows become columns and its columns become rows. If we denote the entry in row $i$, column $j$ of matrix $A$ by $A_{ij}$, then we can describe $A^T$ by: $(A^T)_{ij} = A_{ji}$. For example:

$$\begin{bmatrix*}[c] 1 & 3 \\ 2 & 3 \\ 4 & 1 \end{bmatrix*}^T = \begin{bmatrix*}[c] 1 & 2 & 4 \\ 3 & 3 & 1 \end{bmatrix*}$$

A matrix $A$ is symmetric if $A^T = A$. Given any matrix $R$ (not necessarily square), the product $R^TR$ is always symmetric, because $(R^TR)^T = R^T(R^T)^T = R^TR$. (Note that $(R^T)^T = R$.)
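
Both facts are easy to verify numerically; the sketch below (an illustration added here, using the example matrix above and a random $R$) checks the transpose rule and the symmetry of $R^TR$:

```python
import numpy as np

A = np.array([[1, 3],
              [2, 3],
              [4, 1]])
print(A.T)                   # [[1 2 4], [3 3 1]]: rows of A become columns of A^T

R = np.random.rand(3, 2)     # any matrix, not necessarily square
S = R.T @ R                  # R^T R is a 2 by 2 product
print(np.allclose(S, S.T))   # True: R^T R equals its own transpose
```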

Vector spaces

We can add vectors and multiply them by numbers, which means we can discuss linear combinations of vectors. These combinations follow the rules of a vector space.

One such vector space is $\mathbb{R}^2$, the set of all vectors with exactly two real number components. We depict the vector $\begin{bmatrix} a \\ b \end{bmatrix}$ by drawing an arrow from the origin to the point $(a, b)$, which is $a$ units to the right of the origin and $b$ units above it, and we call $\mathbb{R}^2$ the “$x$-$y$ plane”.

Another example of a space is $\mathbb{R}^n$, the set of (column) vectors with $n$ real number components.

Closure

The collection of vectors with exactly two positive real valued components is not a vector space. The sum of any two vectors in that collection is again in the collection, but multiplying any vector by, say, $-5$, gives a vector that’s not in the collection. We say that this collection of positive vectors is closed under addition but not under multiplication.

If a collection of vectors is closed under linear combinations (i.e. under addition and multiplication by any real numbers), and if multiplication and addition behave in a reasonable way, then we call that collection a vector space.
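
For example (a small illustrative check with made-up vectors), the positive-component collection above passes the addition part of this test but fails the scaling part:

```python
import numpy as np

v = np.array([1.0, 2.0])   # both components positive
w = np.array([3.0, 0.5])

print(v + w)               # [4.  2.5]  -- still positive: closed under addition
print(-5 * v)              # [-5. -10.] -- not positive: not closed under scalar multiplication
```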

Subspaces

A vector space that is contained inside of another vector space is called a subspace of that space. For example, take any non-zero vector $\mathbf{v}$ in $\mathbb{R}^2$. Then the set of all vectors $c\mathbf{v}$, where $c$ is a real number, forms a subspace of $\mathbb{R}^2$. This collection of vectors describes a line through $\begin{bmatrix} 0 \\ 0 \end{bmatrix}$ in $\mathbb{R}^2$ and is closed under addition and scalar multiplication.

A line in $\mathbb{R}^2$ that does not pass through the origin is not a subspace of $\mathbb{R}^2$. Multiplying any vector on that line by $0$ gives the zero vector, which does not lie on the line. Every subspace must contain the zero vector, because vector spaces are closed under scalar multiplication.

The subspaces of $\mathbb{R}^2$ are:

  1. all of $\mathbb{R}^2$,
  2. any line through $\begin{bmatrix} 0 \\ 0 \end{bmatrix}$, and
  3. the zero vector alone (Z).

The subspaces of $\mathbb{R}^3$ are:

  1. all of $\mathbb{R}^3$,
  2. any plane through the origin,
  3. any line through the origin, and
  4. the zero vector alone (Z).

Column space

Given a matrix $A$ with columns in $\mathbb{R}^3$, these columns and all their linear combinations form a subspace of $\mathbb{R}^3$. This is the column space $C(A)$. If $A = \begin{bmatrix*}[c] 1 & 3 \\ 2 & 3 \\ 4 & 1 \end{bmatrix*}$, the column space of $A$ is the plane through the origin in $\mathbb{R}^3$ containing $\begin{bmatrix*}[c] 1 \\ 2 \\ 4 \end{bmatrix*}$ and $\begin{bmatrix*}[c] 3 \\ 3 \\ 1 \end{bmatrix*}$.

Our next task will be to understand the equation $A\mathbf{x} = b$ in terms of subspaces and the column space of $A$.
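
One concrete preview of that question (a NumPy sketch added here, assuming a rank comparison is an acceptable numerical test, with right-hand sides chosen for illustration): $A\mathbf{x} = b$ is solvable exactly when $b$ lies in $C(A)$, i.e. when appending $b$ to $A$ does not increase the rank.

```python
import numpy as np

A = np.array([[1, 3],
              [2, 3],
              [4, 1]])
print(np.linalg.matrix_rank(A))   # 2: the column space C(A) is a plane in R^3

b_in = A @ np.array([1, 1])       # [4, 5, 5] is a combination of the columns
b_out = np.array([0, 0, 1])       # a vector chosen off that plane

for b in (b_in, b_out):
    rank_aug = np.linalg.matrix_rank(np.column_stack([A, b]))
    print(b, "lies in C(A):", rank_aug == np.linalg.matrix_rank(A))
```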

Problems and Solutions

Problem 5.1: (2.7 #13. Introduction to Linear Algebra: Strang)

  • a) Find a 3 by 3 permutation matrix with $P^3 = I$ (but not $P = I$).
  • b) Find a 4 by 4 permutation $\hat{P}$ with $\hat{P}^4 \ne I$.

Solution

  • a) Let $P$ move the rows in a cycle: the first to the second, the second to the third, and the third to the first. So
$$P = \begin{bmatrix*}[c] 0 & 0 & 1 \\ 1 & 0 & 0 \\ 0 & 1 & 0 \end{bmatrix*}, \quad P^2 = \begin{bmatrix*}[c] 0 & 1 & 0 \\ 0 & 0 & 1 \\ 1 & 0 & 0 \end{bmatrix*}, \quad \text{and} \enspace P^3 = I.$$
  • b) Let $\hat{P}$ be the block diagonal matrix with $1$ and $P$ on the diagonal: $\hat{P} = \begin{bmatrix} 1 & 0 \\ 0 & P \end{bmatrix}$. Since $P^3 = I$, also $\hat{P}^3 = I$. So $\hat{P}^4 = \hat{P} \ne I$; a quick numerical check follows below.
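
These claims are easy to confirm (a small NumPy check added here, not part of Strang's printed solution):

```python
import numpy as np

P = np.array([[0, 0, 1],
              [1, 0, 0],
              [0, 1, 0]])
print(np.array_equal(np.linalg.matrix_power(P, 3), np.eye(3)))       # True: P^3 = I

P_hat = np.eye(4, dtype=int)
P_hat[1:, 1:] = P            # 1 in the corner, the 3-cycle P in the lower-right block
print(np.array_equal(np.linalg.matrix_power(P_hat, 4), np.eye(4)))   # False: P_hat^4 = P_hat != I
```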

Problem 5.2: Suppose $A$ is a four by four matrix. How many entries of $A$ can be chosen independently if:

  • a) $A$ is symmetric?
  • b) $A$ is skew-symmetric? ($A^T = -A$)

Solution

  • a) The most general form of a four by four symmetric matrix is:
$$A = \begin{bmatrix*}[c] a & e & f & g \\ e & b & h & i \\ f & h & c & j \\ g & i & j & d \end{bmatrix*}.$$

Therefore $\mathbf{10}$ entries can be chosen independently.

  • b) The most general form of a four by four skew-symmetric matrix is:
$$A = \begin{bmatrix*}[r] 0 & -a & -b & -c \\ a & 0 & -d & -e \\ b & d & 0 & -f \\ c & e & f & 0 \end{bmatrix*}.$$

Therefore $\mathbf{6}$ entries can be chosen independently.
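
More generally (a remark added here, not in the original solution), an $n$ by $n$ symmetric matrix is determined by its diagonal and the entries above it, while a skew-symmetric matrix is forced to have a zero diagonal, so the counts of independent entries are

$$\frac{n(n+1)}{2} \enspace \text{(symmetric)} \qquad \text{and} \qquad \frac{n(n-1)}{2} \enspace \text{(skew-symmetric)},$$

which give $10$ and $6$ for $n = 4$.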

Problem 5.3: (3.1 #18.) True or false (check addition or give a counterexample):

  • a) The symmetric matrices in $M$ (with $A^T = A$) form a subspace.
  • b) The skew-symmetric matrices in $M$ (with $A^T = -A$) form a subspace.
  • c) The unsymmetric matrices in $M$ (with $A^T \ne A$) form a subspace.

Solution

  • a) True: $A^T = A$ and $B^T = B$ lead to:
$$(A + B)^T = A^T + B^T = A + B, \quad \text{and} \enspace (cA)^T = cA.$$
  • b) True: $A^T = -A$ and $B^T = -B$ lead to:
$$(A + B)^T = A^T + B^T = -A - B = -(A + B), \quad \text{and} \enspace (cA)^T = -cA.$$
  • c) False: the sum of two unsymmetric matrices can be symmetric, so the set is not closed under addition (it also fails to contain the zero matrix): $\begin{bmatrix} 1 & 1 \\ 0 & 0 \end{bmatrix} + \begin{bmatrix} 0 & 0 \\ 1 & 1 \end{bmatrix} = \begin{bmatrix} 1 & 1 \\ 1 & 1 \end{bmatrix}$, as the check below confirms.
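
The counterexample can also be verified numerically (a small NumPy check added to the notes):

```python
import numpy as np

A = np.array([[1, 1],
              [0, 0]])
B = np.array([[0, 0],
              [1, 1]])

print(np.array_equal(A, A.T), np.array_equal(B, B.T))   # False False: both matrices are unsymmetric
C = A + B
print(np.array_equal(C, C.T))                           # True: their sum is symmetric
```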