
11.12 Linear Independence and Basis

Proposition

Every linearly independent list in a finite-dimensional vector space \(V\) can be extended to a basis of \(V\).

Proof

  1. Prop: given \(v_1,...,v_k,w_1,...,w_m\), where \(v_1,...,v_k\) is linearly independent and \(w_1,...,w_m\) spans \(V\), we can remove some of the \(w_i\) until the list becomes a basis.

  2. \(v_1,...,v_k\) is linearly independent.

    If \(\langle v_1,\ldots,v_{k}\rangle=V\), we are done \(\checkmark\); if not, pick \(0\neq v_{k+1}\in V\setminus\langle v_1,\ldots,v_{k}\rangle\)

Claim

\(v_1,...,v_k,v_{k+1}\) is linearly independent.

Let \(0=\lambda_1v_1+\cdots+\lambda_{k}v_{k}+\lambda_{k+1}v_{k+1}\). If \(\lambda_{k+1}=0\), then \(\lambda_1=...=\lambda_k=0\), since \(v_1,...,v_k\) is linearly independent.

If \(\lambda_{k+1}\neq0\Rightarrow-\lambda_{k+1}v_{k+1}=\lambda_1v_1+\cdots+\lambda_{k}v_{k}\Rightarrow v_{k+1}=-\frac{\lambda_1}{\lambda_{k+1}}v_1+\cdots+\left(-\frac{\lambda_{k}}{\lambda_{k+1}}\right)v_{k}\)

This means \(v_{k+1}\in\langle v_1,\ldots,v_{k}\rangle\), contradicting the choice \(0\neq v_{k+1}\in V\setminus\langle v_1,\ldots,v_{k}\rangle\).

In this way we construct a linearly independent list \(v_1,...,v_k,...,v_n\). Since \(V\) is finite-dimensional, the theorem (a linearly independent list is never longer than a spanning list) implies that this process ends.
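For concreteness (a small example, not from the lecture): in \(\mathbb{R}^3\), the list \(v_1=(1,1,0)\), \(v_2=(0,1,0)\) is linearly independent but \(\langle v_1,v_2\rangle\neq\mathbb{R}^3\); picking \(v_3=(0,0,1)\notin\langle v_1,v_2\rangle\) gives the linearly independent list \(v_1,v_2,v_3\), which already spans \(\mathbb{R}^3\), so the process stops with a basis.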

Theorem

For every subspace \(U\) of a finite-dimensional vector space \(V\), there exists a subspace \(W\) such that \(V=U\oplus W\)

Notation: \(W\) is a direct complement of \(U\)

Proof

\(U\) is finite-dimensional, so it has a basis \(u_1,...,u_{k}\). By the proposition above, extend it to a basis \(u_1,...,u_{k},w_1,...,w_{l}\) of \(V\).

Take \(W=\langle w_1,...,w_l\rangle\)

Claim: \(V=U\oplus W\).

(1) Every \(v\in V\) can be written as \(v=\lambda_1u_1+\cdots+\lambda_{k}u_{k}+\lambda_{k+1}w_1+\cdots+\lambda_{k+l}w_{l}=u+w\) with \(u\in U\), \(w\in W\); hence \(V=U+W\).

(2) If \(v\in U\cap W\), then \(v=\lambda_1u_1+\cdots+\lambda_{k}u_{k}=\mu_1w_1+\cdots+\mu_{l}w_{l}\), so \(0=\lambda_1u_1+\cdots+\lambda_{k}u_{k}-\left(\mu_1w_1+\cdots+\mu_{l}w_{l}\right)\).

Hence (because \(u_1,..., u_k, w_1,..., w_l\) is a basis) \(\lambda_1 = \cdots = \lambda_k = \mu_1 = \cdots = \mu_l = 0\). In particular, \(v=0\).

So \(U \cap W = \{0\}\), and together with (1), \(V = U \oplus W\).
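For example (added for illustration): in \(V=\mathbb{R}^3\) with \(U=\langle(1,0,0),(0,1,0)\rangle\), both \(W=\langle(0,0,1)\rangle\) and \(W'=\langle(1,1,1)\rangle\) satisfy \(V=U\oplus W=U\oplus W'\), so a direct complement is not unique.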

Remark

Alternatively, just use the basic theorem to prove that the sum is direct.

We need to prove that \(0\) can be written in only one way as \(u+w\) with \(u\in U\), \(w\in W\), namely the trivial way.

That is, we want: if \(0=u+w\) with \(u\in U\), \(w\in W\), then \(u=w=0\).

Consider \(0=u+w=\lambda_1u_1+...+\lambda_{k}u_{k}+\lambda_{k+1}w_1+...+\lambda_{k+l}w_{l}\)

Since \(u_1,...,u_k,w_1,...,w_l\) is a basis, \(\lambda_1 = \cdots = \lambda_{k+l} = 0\)

Then \(u=w=0\)

Theorem

Any two bases of a finite dimension vector space have the same length.

Proof

Consider two bases, \(\text{basis}_1\) of length \(m\) and \(\text{basis}_2\) of length \(n\); each is a linearly independent list that also spans \(V\).

First, we regard \(\text{basis}_1\) as a linearly independent list and \(\text{basis}_2\) as a spanning list. By the theorem (a linearly independent list is never longer than a spanning list), \(m\leq n\)

Then, we regard \(\text{basis}_2\) as a linearly independent list and \(\text{basis}_1\) as a spanning list. Then we have \(n\leq m\)

Therefore \(n=m\).

Notation

The dimension of a finite-dimensional vector space \(V\) is the length of any basis of \(V\). We denote it by \(\dim V\).

Remark

If \(U\hookrightarrow V\) (i.e. \(U\) is a subspace of \(V\)), then \(\dim U \leq \dim V\). Moreover, if \(U\subsetneq V\), then \(\dim U < \dim V\).

Theorem

Let \(V\) be of dimension \(\dim V = n\). Then

(a) If \(u_1, \dots, u_n\) is linearly independent, then it is a basis.

(b) If \(w_1, \dots, w_n\) is a spanning list, then it is a basis.

a. Every linearly independent list in \(V\) can be extended to a basis, and every basis of \(V\) has length \(n\). Since \(u_1,\dots,u_n\) already has length \(n\), the extension adds no vectors, so the list is already a basis.

b. If the list is linearly independent, we are done by (a).

If it is linearly dependent, by the proposition we can remove vectors (keeping the span) until the list becomes linearly independent. But then the list has fewer than \(n\) vectors, and fewer than \(n\) vectors cannot span \(V\) since \(\dim V=n\). Contradiction!
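Quick illustration (added): in \(\mathbb{R}^2\), where \(n=2\), the linearly independent list \((1,0),(1,1)\) is automatically a basis by (a), and any spanning list of length \(2\) is a basis by (b).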

Sum and direct product

If \(U,W\hookrightarrow V\), then \(\dim(U+W)=\dim(U)+\dim(W)-\dim(U\cap W)\)

Sketch: let \(v_1,\dots,v_{m}\) be a basis of \(U\cap W\) and extend it so that \(\begin{cases}v_1,\dots,v_{m},u_1,\dots,u_{k}\text{ is a basis of }U\\ v_1,\dots,v_{m},w_1,\dots,w_{\ell}\text{ is a basis of }W\end{cases}\)

\(\Rightarrow v_1,\dots,v_{m},u_1,\dots,u_{k},w_1,\dots,w_{\ell}\) is a basis of \(U + W\).

Analysis: We need to prove \(v_1,\dots,v_{m},u_1,\dots,u_{k},w_1,\dots,w_{\ell}\) is a basis of \(U + W\).

Then we need to prove that \(v_1,\dots,v_{m},u_1,\dots,u_{k},w_1,\dots,w_{\ell}\) is a linearly independent list that spans \(U+W\)

First, proving that it spans \(U+W\) is easy.

Second, we prove that \(v_1,\dots,v_{m},u_1,\dots,u_{k},w_1,\dots,w_{\ell}\) is a linearly independent list.

We need to prove: if \(\lambda_1v_1+\dots+\lambda_{m}v_{m}+\lambda_{m+1}u_1+\dots+\lambda_{m+k}u_{k}+\lambda_{m+k+1}w_1+\dots+\lambda_{m+k+l}w_{\ell}=0\), then \(\lambda_1=...=\lambda_{m+k+l}=0\) (Goal)

Set \(a=\lambda_1v_1+\cdots+\lambda_{m}v_{m}+\lambda_{m+1}u_1+\cdots+\lambda_{m+k}u_{k}\in U\) \((\ast)\) and \(b=\lambda_{m+k+1}w_1+\cdots+\lambda_{m+k+l}w_{\ell}\in W\) \((\ast\ast)\)

Substituting into the Goal: \(a+b=0\), where \(a\in U\) and \(b\in W\)

Then \(a=-b\), which means \(a,b\in U\cap W\)

Since \(a\in U\cap W\), we can write \(a=\mu_1v_1+\cdots+\mu_{m}v_{m}\); subtracting \((\ast)\) gives \((\mu_1-\lambda_1)v_1+\cdots+(\mu_{m}-\lambda_{m})v_{m}-\lambda_{m+1}u_1-\cdots-\lambda_{m+k}u_{k}=0\)

Since \(v_1,\ldots,v_{m},u_1,\ldots,u_{k}\) is a basis of \(U\), all coefficients vanish; in particular \(\lambda_{m+1}=...=\lambda_{m+k}=0\)

Similarly, since \(b\in U\cap W\), write \(b=\nu_1v_1+\cdots+\nu_{m}v_{m}\); subtracting \((\ast\ast)\) gives \(\nu_1v_1+\cdots+\nu_{m}v_{m}-\lambda_{m+k+1}w_1-\cdots-\lambda_{m+k+l}w_{\ell}=0\)

Since \(v_1,\ldots,v_{m},w_1,\ldots,w_{\ell}\) is a basis of \(W\), all coefficients vanish; in particular \(\lambda_{m+k+1}=...=\lambda_{m+k+l}=0\) and \(\nu_1=...=\nu_{m}=0\)

Then \(b=0\), and hence \(a=-b=0\)

Now \((\ast)\) reads \(\lambda_1v_1+\cdots+\lambda_{m}v_{m}+\lambda_{m+1}u_1+\cdots+\lambda_{m+k}u_{k}=a=0\); since \(\lambda_{m+1}=...=\lambda_{m+k}=0\) and \(v_1,\ldots,v_{m}\) is a basis of \(U\cap W\), we get \(\lambda_1=...=\lambda_{m}=0\)
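A quick check of the formula (example added, not from the lecture): in \(\mathbb{R}^3\), take \(U=\langle e_1,e_2\rangle\) and \(W=\langle e_2,e_3\rangle\). Then \(U\cap W=\langle e_2\rangle\), \(U+W=\mathbb{R}^3\), and indeed \(\dim(U+W)=3=2+2-1\); in the notation of the sketch, \(v_1=e_2\), \(u_1=e_1\), \(w_1=e_3\), and \(e_2,e_1,e_3\) is a basis of \(U+W\).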

Proposition

Given two finite-dimensional vector spaces \(V\) and \(W\), \(V \times W\) is a vector space with: \((v, w) + (v', w') = (v + v', w + w')\), \(\lambda (v, w) = (\lambda v, \lambda w)\).

\(V\sim V^{\prime}\hookrightarrow V\times W\), where \(V' = V \times \{0\}\) is a subspace of \(V \times W\).

\(W\sim W^{\prime}\hookrightarrow V\times W\), where \(W' = \{0\} \times W\) is a subspace of \(V \times W\).

Now \(V \times W = V' \oplus W'\).

Remark

\(\dim(V \times W) = \dim V + \dim W\)

\((v_1,0),\dots,(v_{n},0),(0,w_1),\dots,(0,w_{m})\) is a basis of \(V \times W\) iff \(\{v_1,\ldots,v_{n}\}\) is a basis of \(V\) and \(\{w_1,\ldots,w_{m}\}\) is a basis of \(W\)
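For example, \(\dim(\mathbb{R}^2\times\mathbb{R}^3)=2+3=5\), with basis \((e_1,0),(e_2,0),(0,f_1),(0,f_2),(0,f_3)\), where \(e_1,e_2\) and \(f_1,f_2,f_3\) are the standard bases of \(\mathbb{R}^2\) and \(\mathbb{R}^3\).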

Effective computations

Questions

Let \(v_1, \dots, v_m \in \mathbb{F}^n\).

Q:

  1. Are they linearly independent?

  2. Given \(w \in \mathbb{F}^n\), is \(w \in \langle v_1, \dots, v_m \rangle\)?

  3. How to describe \(\langle v_1, \dots, v_m \rangle\) implicitly?

  4. How to choose from \(v_1, \dots, v_m\) a basis?

  5. Which is \(\dim \langle v_1, \dots, v_m \rangle\)?

Given \(A \in M_{m \times n}(\mathbb{F})\), the \(i\)-th row of \(A\) is \(d_i \in \mathbb{F}^n\), \(d_i = (d_{i1}, \dots, d_{in})\).

The row space of \(A\) is \(\langle d_1,\dots,d_{m}\rangle\subseteq\mathbb{F}^{n}\).

Theorem

Row-equivalent matrices have the same row space.

Notation

\(\operatorname{rank} A = \dim(\text{row space of } A) = r\) (number of pivots)

Example

\(W=\langle v_1,v_2,v_3\rangle\subseteq\mathbb{R}^4\), where: \(v_1 = (1, 2, 2, 1)\), \(v_2 = (0, 2, 0, 1)\), \(v_3 = (2, 2, 4, 1)\).

  1. \(\lambda_1 (1, 2, 2, 1) + \lambda_2 (0, 2, 0, 1) + \lambda_3 (2, 2, 4, 1) = (0, 0, 0, 0)\) \(\Rightarrow\) \(\begin{pmatrix}\begin{array}{ccc|c}1 & 0 & 2 & 0\\ 2 & 2 & 2 & 0\\ 2 & 0 & 4 & 0\\ 1 & 1 & 1 & 0\end{array}\end{pmatrix}\)

  2. \(w=(a,b,c,d):\)\(\begin{pmatrix}\begin{array}{ccc|c}1 & 0 & 2 & a\\ 2 & 2 & 2 & b\\ 2 & 0 & 4 & c\\ 1 & 1 & 1 & d\end{array}\end{pmatrix}\)

  3. \(\begin{pmatrix}\begin{array}{ccc|c}1 & 0 & 2 & a\\ 2 & 2 & 2 & b\\ 2 & 0 & 4 & c\\ 1 & 1 & 1 & d\end{array}\end{pmatrix}\sim\begin{pmatrix}\begin{array}{ccc|c}1 & 0 & 2 & a\\ 0 & 2 & -2 & b-2a\\ 0 & 0 & 0 & c-2a\\ 0 & 1 & -1 & d-a\end{array}\end{pmatrix}\sim\begin{pmatrix}\begin{array}{ccc|c}1 & 0 & 2 & a\\ 0 & 1 & -1 & d-a\\ 0 & 0 & 0 & b-2d\\ 0 & 0 & 0 & c-2a\end{array}\end{pmatrix}\)

    Q1: No, \(v_1,v_2,v_3\) are not linearly independent: \(\lambda_3 = 1, \lambda_2 = 1, \lambda_1 = -2\) gives \(-2v_1+v_2+v_3=0\)

    Q2: \(w \in \langle v_1, v_2, v_3 \rangle\) iff \(b = 2d, c = 2a\)

    Q3: \(W = \{ (a, b, c, d) \in \mathbb{R}^4 : b = 2d, c = 2a \}\)

    Q4: Choose the ones corresponding to pivots.

    After removing \(v_3\) (the non-pivot column), both remaining columns contain pivots, so the system has only the trivial solution; hence \(v_1, v_2\) are linearly independent and form a basis of \(W\).

    Q5: \(\dim W = r = 2\) (the number of pivots); see also the computational check below.
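The row reduction in this example can also be checked by machine. Below is a minimal sketch (not part of the lecture) using SymPy; the variable names and comments are my own, and the point is only to confirm the answers to Q1, Q4 and Q5 for this particular example.

```python
# Minimal SymPy check of the example: build the matrix whose columns are
# v1, v2, v3, compute its reduced row echelon form, and read off the
# pivot columns and the rank.
from sympy import Matrix

v1 = [1, 2, 2, 1]
v2 = [0, 2, 0, 1]
v3 = [2, 2, 4, 1]

# Columns of A are v1, v2, v3 (the coefficient matrix from step 1,
# without the augmented column).
A = Matrix([v1, v2, v3]).T

rref, pivot_cols = A.rref()
print(rref)        # only two pivots, so v1, v2, v3 are NOT linearly independent (Q1)
print(pivot_cols)  # (0, 1): pivot columns correspond to v1 and v2, a basis of W (Q4)
print(A.rank())    # 2 = dim W (Q5)

# One explicit dependence relation: -2*v1 + v2 + v3 = 0.
print(-2 * Matrix(v1) + Matrix(v2) + Matrix(v3))  # prints the zero vector
```

Running this prints a reduced row echelon form with pivots in columns \(0\) and \(1\) and rank \(2\), matching the hand computation above.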