12.11 Annihilator
-
Suppose \(V\) is finite-dimensional and \(U\) is a subspace. Prove that \(U = V \iff U^\circ = \{0\}\).
(In fact the statement is always true and does not depend on \(\dim(V)\) being finite; the proof of \(\Leftarrow\) below, however, uses finite-dimensionality.)
Proof
\(\Rightarrow\)) Suppose \(U = V\), then by definition:
\(U^{\circ}=\{\varphi\in V^{\ast}:\varphi\left(u\right)=0\;\forall u\in U\}=\{\varphi\in V^{\ast}:\varphi\left(v\right)=0\;\forall v\in V\}=\{0\}\), since a functional that vanishes on every \(v \in V\) is the zero functional.
\(\Leftarrow\)) Suppose \(U^\circ = \{0\}\). By the theorem \(\dim(V) = \dim(U) + \dim(U^\circ)\), we get \(\dim(U) = \dim(V) - \dim(U^\circ) = \dim(V)\).
So \(U \subseteq V\) is a subspace such that \(\dim(U) = \dim(V)\)\(\Rightarrow U = V\)
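Identifying \(V\) with \(\mathbb{R}^n\) and functionals with row vectors, \(U^\circ\) is the null space of any matrix whose rows span \(U\), so the dimension theorem used above can be checked numerically. A minimal sketch with NumPy (the subspace \(U\) below is an arbitrary example, not from the exercise):

```python
import numpy as np

# V = R^4; U is spanned by the rows of B (an arbitrary example subspace).
B = np.array([[1.0, 0.0, 2.0, -1.0],
              [0.0, 1.0, 1.0,  3.0]])

# Identify V* with row vectors: phi lies in U° iff B @ phi = 0, i.e. U° = Null(B).
# A basis of Null(B) is given by the right-singular vectors whose singular
# value is zero.
_, s, Vt = np.linalg.svd(B)
s = np.concatenate([s, np.zeros(B.shape[1] - len(s))])
annihilator_basis = Vt[s < 1e-10]

dim_V, dim_U = B.shape[1], np.linalg.matrix_rank(B)
assert annihilator_basis.shape[0] == dim_V - dim_U   # dim U° = dim V - dim U
assert np.allclose(B @ annihilator_basis.T, 0)       # each phi vanishes on U
```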
-
Suppose \(U, W \subseteq V\) are subspaces of \(V\) (finite-dimensional). Prove that \(U \subseteq W \iff W^\circ \subseteq U^\circ\).
\(\Rightarrow\)) If \(U \subseteq W\), take \(\varphi \in W^\circ\): \(\varphi(w) = 0 \; \forall w \in W\)
Suppose \(\varphi \notin U^\circ\) \(\Rightarrow \exists u \in U \text{ such that } \varphi(u) \neq 0\).
Contradiction! Since \(U \subseteq W\), we have \(u \in W\), so \(\varphi(u) = 0\). Hence \(\varphi \in U^\circ\). Therefore \(W^\circ \subseteq U^\circ\).
\(\Leftarrow\)) Suppose \(W^\circ \subseteq U^\circ\). We want to see \(U \subseteq W\).
Suppose by contradiction \(\exists u \in U \text{ such that } u \notin W\).
Note that \(u \neq 0\) (\(0 \in W\), subspace).
Take a basis \(\{w_1, \dots, w_k\}\) of \(W\). Since \(u \notin W\), the list \(\{w_1, \dots, w_k, u\}\) is linearly independent, so we can extend it to a basis of \(V\): \(B = \{w_1, \dots, w_k, u, v_1, \dots, v_m\}\)
Define \(\varphi \in V^\ast\) on \(B\) by: \(\varphi(b) = \begin{cases} 1 & \text{if } b = u \\ 0 & \text{otherwise} \end{cases}\)
By construction, \(\varphi \in W^\circ\) since \(\varphi(w) = 0 \; \forall w \in W\).
(I can write any \(w \in W\) as a linear combination of \(\{w_1, \dots, w_k\}\), and \(\varphi\) vanishes on each \(w_i\)).
Since \(W^\circ \subseteq U^\circ\), \(\Rightarrow \varphi \in U^\circ\), \(\Rightarrow\varphi(\tilde{u})=0\;\forall\tilde{u}\in U\).
Contradiction! Since \(\varphi(u)=1\) and \(u \in U\).
Therefore \(U \subseteq W\).
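The forward direction can also be checked numerically: identifying functionals on \(\mathbb{R}^n\) with row vectors, the annihilator of a row span is a null space, and containment of subspaces reverses under taking annihilators. A sketch with NumPy (the subspaces \(U \subseteq W\) below are an arbitrary example):

```python
import numpy as np

def annihilator(M):
    """Rows of M span a subspace of R^n; return a basis (as rows) of its
    annihilator, i.e. the null space of M, identifying V* with row vectors."""
    _, s, Vt = np.linalg.svd(M)
    s = np.concatenate([s, np.zeros(M.shape[1] - len(s))])
    return Vt[s < 1e-10]

# U ⊆ W ⊆ R^4: U is spanned by the first row of W's spanning set.
W = np.array([[1.0, 2.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 1.0]])
U = W[:1]

W0, U0 = annihilator(W), annihilator(U)

# W° ⊆ U°: every functional vanishing on W also vanishes on U.
assert np.allclose(U @ W0.T, 0)
# Here U ⊊ W, so the containment is strict: U° has larger dimension.
assert U0.shape[0] > W0.shape[0]
```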
-
Suppose \(V\) is finite-dimensional and \(U\) is a subspace \(\Rightarrow U = \{v \in V : \varphi(v) = 0 \; \forall \varphi \in U^\circ \}\)
Let \(A = \{v \in V : \varphi(v) = 0 \; \forall \varphi \in U^\circ \}\). We wish to see \(U = A\).
\((\subseteq)\) If \(u \in U\), then \(\varphi(u) = 0 \; \forall \varphi \in U^\circ\) (by definition) \(\Rightarrow u \in A\). Therefore \(U \subseteq A\).
\((\supseteq)\) Suppose by contradiction \(v \in A \setminus U\).
We have that \(v \neq 0\) (\(0 \in U\), subspace always).
Take a basis \(\{u_1, \dots, u_m\}\) of \(U\). Since \(v \notin U\), the list \(\{u_1, \dots, u_m, v\}\) is linearly independent, so it extends to a basis of \(V\): \(B = \{u_1, \dots, u_m, v, v_1, \dots, v_n\}\).
Now, consider the dual basis of \(B\): \(B^{\ast}=\{\psi_1,\dots,\psi_{m},\varphi,\varphi_1,\dots,\varphi_{n}\}\)(The \(\varphi\) corresponds to \(v\)).
Let's focus on the element \(\varphi\) in \(B^\ast\).
By definition:
\(\varphi(w) =\begin{cases} 1 & \text{if } w = v \\ 0 & \text{otherwise} \end{cases}\)\(\Rightarrow \varphi(v) = 1 \quad \text{and} \quad \varphi(u) = 0 \; \forall u \in U \; (\varphi(u_i) = 0 \text{ for every } u_i \text{ in my basis for } U)\)
\(\Rightarrow \varphi(v) = 1 \quad \text{and} \quad \varphi \in U^\circ\)
Remember \(A = \{\tilde{v} \in V : \psi(\tilde{v}) = 0 \; \forall \psi \in U^\circ\}\). Since \(v \in A\) and \(\varphi \in U^\circ\), we would need \(\varphi(v) = 0\). Contradiction! Since \(\varphi(v) = 1\).
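This double-annihilator identity is easy to test numerically: compute \(U^\circ\) as a null space, then take the common kernel of \(U^\circ\) and check that it is exactly \(U\). A sketch with NumPy (the subspace below is an arbitrary example):

```python
import numpy as np

def null_rows(M):
    """Basis (as rows) of {x in R^n : M @ x = 0}, via the SVD."""
    _, s, Vt = np.linalg.svd(M)
    s = np.concatenate([s, np.zeros(M.shape[1] - len(s))])
    return Vt[s < 1e-10]

# U ⊆ R^5, spanned by the rows of U_basis (an arbitrary example).
U_basis = np.array([[1.0, 0.0, 1.0, 0.0, 2.0],
                    [0.0, 1.0, 0.0, 3.0, 0.0]])

U0 = null_rows(U_basis)     # basis of U° (functionals as row vectors)
A = null_rows(U0)           # A = {v : phi(v) = 0 for every phi in U°}

# A = U: same dimension, and stacking both bases adds no new directions.
assert A.shape[0] == np.linalg.matrix_rank(U_basis)
assert np.linalg.matrix_rank(np.vstack([U_basis, A])) == np.linalg.matrix_rank(U_basis)
```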
-
Suppose \(T \in \mathcal{L}(P_5(\mathbb{R}), P_5(\mathbb{R}))\) is such that \(\text{Null}(T^*) = \langle \varphi \rangle\), where \(\varphi \in (P_5(\mathbb{R}))^\ast\) is given by \(\varphi(p) = p(1)\).
Prove that \(\text{Range}(T) = \{ p \in P_5(\mathbb{R}) : p(1) = 0 \}\).
By the last theorem, we know that \(\text{Null}(T^*) = (\text{Range}(T))^\circ \implies (\text{Range}(T))^\circ = \langle \varphi \rangle = \{ \alpha \varphi : \alpha \in \mathbb{R} \}\).
By the last exercise:
\(\text{Range}(T) = \{ p \in P_5(\mathbb{R}) : \psi(p) = 0, \ \forall \psi \in (\text{Range}(T))^\circ \}\)
\(= \{ p \in P_5(\mathbb{R}) : \psi(p) = 0, \ \forall \psi \in \langle \varphi \rangle \}\)
\(= \{ p \in P_5(\mathbb{R}) : (\alpha \varphi)(p) = 0, \ \forall \alpha \in \mathbb{R} \}\)
\(= \{ p \in P_5(\mathbb{R}) : \alpha \, p(1) = 0, \ \forall \alpha \in \mathbb{R} \}\)
\(= \{ p \in P_5(\mathbb{R}) : p(1) = 0 \}\)
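The exercise does not specify \(T\), but one concrete map with \(\text{Null}(T^*) = \langle \varphi \rangle\) is the orthogonal projection of coefficient vectors onto the hyperplane \(\{p(1) = 0\}\) (note that \(p(1)\) is the sum of \(p\)'s coefficients). A hypothetical numerical check under that assumption:

```python
import numpy as np

# Represent p ∈ P5(R) by its coefficient vector a ∈ R^6; then p(1) = sum(a),
# so phi(p) = p(1) corresponds to the row vector of all ones.
phi = np.ones(6)

# Hypothetical T with the stated property: orthogonal projection onto the
# hyperplane {p(1) = 0}, i.e. M = I - (1/6) phi phi^T in coefficient space.
M = np.eye(6) - np.outer(phi, phi) / 6

# Null(T*) = Null(M^T) is spanned by phi, so Null(T*) = <phi>.
assert np.allclose(M.T @ phi, 0)
assert np.linalg.matrix_rank(M) == 5        # Null(M^T) is 1-dimensional

# Range(T) ⊆ {p : p(1) = 0}: every output's coefficients sum to zero ...
p = np.array([3.0, -1.0, 2.0, 0.0, 5.0, 1.0])
assert np.isclose((M @ p).sum(), 0)
# ... and ⊇: any q with q(1) = 0 is fixed by M, hence lies in the range.
q = np.array([1.0, -2.0, 1.0, 0.0, 0.0, 0.0])
assert np.allclose(M @ q, q)
```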
-
Let \(\mathcal{B} = \{ (1,0), (1,-1) \}\), \(\mathcal{B}' = \{ (0,1), (1,1) \}\) be bases of \(\mathbb{R}^2\).
-
1. Find the change of basis matrix \(P_{\mathcal{B}}^{\mathcal{B}'}\) from \(\mathcal{B}\) to \(\mathcal{B}'\).
We need to find \([ (1,0) ]_{\mathcal{B}'}\) and \([ (1,-1) ]_{\mathcal{B}'}\), where \((1,0), (1,-1) \in \mathcal{B}\).
Note that
\((1,0) = a(0,1) + b(1,1) \implies b = 1, \ a = -1\)\(\implies [ (1,0) ]_{\mathcal{B}'} = \begin{pmatrix} -1 \\ 1 \end{pmatrix}\)
Similarly, \((1,-1) = c(0,1) + d(1,1) \implies d = 1, \ c = -2\)\(\implies [ (1,-1) ]_{\mathcal{B}'} = \begin{pmatrix} -2 \\ 1 \end{pmatrix}\)
So, the change of basis matrix from \(\mathcal{B}\) to \(\mathcal{B}'\) is given by \(P_{\mathcal{B}}^{\mathcal{B}'} = \begin{pmatrix} -1 & -2 \\ 1 & 1 \end{pmatrix}\)
2. Find the coordinates of \((0,3)\) in \(\mathcal{B}'\).
Note that \((0,3)=3(1,0)+(-3)(1,-1)\)\(\implies[(0,3)]_{\mathcal{B}}=\begin{pmatrix}3\\ -3\end{pmatrix}\)
To find the coordinates in \(\mathcal{B}'\), we do \(P_{\mathcal{B}}^{\mathcal{B}'} [ (0,3) ]_{\mathcal{B}} = \begin{pmatrix} -1 & -2 \\ 1 & 1 \end{pmatrix} \begin{pmatrix} 3 \\ -3 \end{pmatrix} = \begin{pmatrix} 3 \\ 0 \end{pmatrix}\)
Coordinates in \(\mathcal{B}'\): \((0,3) = 3(0,1) + 0(1,1)\)
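The whole computation can be verified numerically: with the basis vectors as columns of matrices, \(P_{\mathcal{B}}^{\mathcal{B}'}\) is obtained by solving a linear system. A sketch with NumPy:

```python
import numpy as np

# Basis vectors as the COLUMNS of each matrix.
B_mat  = np.array([[1.0, 1.0], [0.0, -1.0]])   # B  = {(1,0), (1,-1)}
Bp_mat = np.array([[0.0, 1.0], [1.0, 1.0]])    # B' = {(0,1), (1,1)}

# [v]_{B'} solves Bp_mat @ x = v, so the matrix sending B-coordinates to
# B'-coordinates is Bp_mat^{-1} @ B_mat.
P = np.linalg.solve(Bp_mat, B_mat)
assert np.allclose(P, [[-1.0, -2.0], [1.0, 1.0]])

# Coordinates of (0,3): first in B, then mapped into B'.
coords_B = np.linalg.solve(B_mat, [0.0, 3.0])
assert np.allclose(coords_B, [3.0, -3.0])
assert np.allclose(P @ coords_B, [3.0, 0.0])   # (0,3) = 3(0,1) + 0(1,1)
```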
-
-
Let \(\mathcal{B} = \{ (1,-2,1), (2,-3,3), (-2,2,-3) \}\) be a basis of \(\mathbb{R}^3\).
-
1. Find the change of basis matrix from the canonical basis \(\mathcal{C} = \{ e_1, e_2, e_3 \}\) to \(\mathcal{B}\).
Way 1: Find the coordinates \([e_i]_{\mathcal{B}}\).
Way 2: Find the change of basis matrix from \(\mathcal{B}\) to \(\mathcal{C}\) (\(P_{\mathcal{B}}^{\mathcal{C}}\)). Then we have that \((P_{\mathcal{B}}^{\mathcal{C}})^{-1} = P_{\mathcal{C}}^{\mathcal{B}}\).
So, let's do Way 2:
To find \(P_{\mathcal{B}}^{\mathcal{C}}\), we need to find \([v]_{\mathcal{C}}\) for \(v \in \mathcal{B}\).
\([(1, -2, 1)]_{\mathcal{C}} = \begin{pmatrix} 1 \\ -2 \\ 1 \end{pmatrix}, \quad (1, -2, 1) = 1(1, 0, 0) - 2(0, 1, 0) + 1(0, 0, 1)\)
And this is the same for the other elements in \(\mathcal{B}\).
So we get: \(P_{\mathcal{B}}^{\mathcal{C}} = \begin{pmatrix} 1 & 2 & -2 \\ -2 & -3 & 2 \\ 1 & 3 & -3 \end{pmatrix}.\) Then \(P_{\mathcal{C}}^{\mathcal{B}} = \left(P_{\mathcal{B}}^{\mathcal{C}}\right)^{-1} = \begin{pmatrix} 3 & 0 & -2 \\ -4 & -1 & 2 \\ -3 & -1 & 1 \end{pmatrix}.\)
2. Find the coordinates of \((1, 0, 1)\) in \(\mathcal{B}\).
We have that \([(1, 0, 1)]_{\mathcal{B}} = P_{\mathcal{C}}^{\mathcal{B}} [(1, 0, 1)]_{\mathcal{C}} = \begin{pmatrix} 3 & 0 & -2 \\ -4 & -1 & 2 \\ -3 & -1 & 1 \end{pmatrix} \begin{pmatrix} 1 \\ 0 \\ 1 \end{pmatrix} = \begin{pmatrix} 1 \\ -2 \\ -2 \end{pmatrix}.\)
Verify: \((1, 0, 1) = 1(1, -2, 1) - 2(2, -3, 3) - 2(-2, 2, -3)\)
3. Find the coordinates of \((x, y, z) \in \mathbb{R}^3\) in \(\mathcal{B}\).
\([(x, y, z)]_{\mathcal{B}} = P_{\mathcal{C}}^{\mathcal{B}} [(x, y, z)]_{\mathcal{C}}\)\(= \begin{pmatrix} 3 & 0 & -2 \\ -4 & -1 & 2 \\ -3 & -1 & 1 \end{pmatrix} \begin{pmatrix} x \\ y \\ z \end{pmatrix} = \begin{pmatrix} 3x - 2z \\ -4x - y + 2z \\ -3x - y + z \end{pmatrix}.\)
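As a sanity check, the inverse, the coordinates of \((1,0,1)\), and the symbolic formula can all be verified numerically (the test point \((2,-1,3)\) is arbitrary):

```python
import numpy as np

# Columns of P_B_to_C are the vectors of B expressed in the canonical basis.
P_B_to_C = np.array([[ 1.0,  2.0, -2.0],
                     [-2.0, -3.0,  2.0],
                     [ 1.0,  3.0, -3.0]])

# Way 2: the change of basis matrix from C to B is the inverse.
P_C_to_B = np.linalg.inv(P_B_to_C)
assert np.allclose(P_C_to_B, [[ 3.0,  0.0, -2.0],
                              [-4.0, -1.0,  2.0],
                              [-3.0, -1.0,  1.0]])

# Coordinates of (1, 0, 1) in B, plus a reconstruction check.
v = np.array([1.0, 0.0, 1.0])
coords = P_C_to_B @ v
assert np.allclose(coords, [1.0, -2.0, -2.0])
assert np.allclose(P_B_to_C @ coords, v)

# The symbolic formula for (x, y, z), checked at an arbitrary point.
x, y, z = 2.0, -1.0, 3.0
assert np.allclose(P_C_to_B @ [x, y, z],
                   [3*x - 2*z, -4*x - y + 2*z, -3*x - y + z])
```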
-