Inner Products, Projections, and Linear Systems

We want to prove that:
\(\langle W, V \rangle = \| V \| \cdot \| W \| \cdot \cos(\theta)\), where \(\theta\) is the angle between \(V\) and \(W\).


Case 1 (\(\mathbb{R}^2\)):

\(V = \begin{pmatrix} 1 \\ 0 \end{pmatrix}, W = \begin{pmatrix} a \\ b \end{pmatrix}\), \(a^2 + b^2 = 1\)

\(\langle W, V \rangle = a \cdot 1 + b \cdot 0 = a = \cos(\theta)\) (by the definition of \(\cos\): \(W\) lies on the unit circle at angle \(\theta\) from \(V\))


Case 2: \(V, W \in \mathbb{R}^2\), \(\| V \| = \| W \| = 1\)

Let \(A = \begin{pmatrix} \cos(\alpha) & -\sin(\alpha) \\ \sin(\alpha) & \cos(\alpha) \end{pmatrix}\) be a rotation matrix.


Choose \(\alpha\) so that \(V = A \begin{pmatrix} 1 \\ 0 \end{pmatrix}\), and put \(W_0 = A^{-1} W\), so that \(W = A W_0\).

Now:
\(\langle W, V \rangle = \langle A W_0, A \begin{pmatrix} 1 \\ 0 \end{pmatrix} \rangle\)
\(= (A W_0)^T A \begin{pmatrix} 1 \\ 0 \end{pmatrix}\)
\(= W_0^T A^T A \begin{pmatrix} 1 \\ 0 \end{pmatrix}\)
\(= W_0^T \begin{pmatrix} 1 \\ 0 \end{pmatrix} = \langle W_0, \begin{pmatrix} 1 \\ 0 \end{pmatrix} \rangle\) (since \(A^T A = I\): rotation matrices are orthogonal)
\(= \cos(\theta)\) (by Case 1; the rotation preserves angles, so the angle between \(W_0\) and \(\begin{pmatrix} 1 \\ 0 \end{pmatrix}\) is again \(\theta\))

Case 3: \(V, W \in \mathbb{R}^2\) of any length, both nonzero.

\(W_0 = \frac{W}{\| W \|}, V_0 = \frac{V}{\| V \|}\)


Then \(\| W_0 \| = 1, \| V_0 \| = 1\)

Applying Case 2, we get: \(\langle W_0, V_0 \rangle = \cos(\theta)\)

\(\left\langle \frac{W}{\|W\|}, \frac{V}{\|V\|} \right\rangle = \cos(\theta)\)

\(\frac{1}{\| W \|} \cdot \frac{1}{\| V \|} \cdot \langle W, V \rangle = \cos(\theta)\) (by bilinearity of the inner product)

Thus, \(\langle W, V \rangle = \| W \| \cdot \| V \| \cdot \cos(\theta)\) for all \(V, W \in \mathbb{R}^2\)

The same idea works in \(\mathbb{R}^n\): any two vectors lie in a common plane, and an orthogonal change of coordinates carries that plane onto the coordinate plane.
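
As a quick numerical sanity check of the identity (a minimal sketch in NumPy; the angles and lengths are arbitrary example values):

```python
import numpy as np

# Build two unit vectors in R^2 from known angles, so that theta is
# known by construction rather than read off the dot product itself.
alpha, beta = 0.7, 2.1                       # arbitrary example angles (radians)
V = np.array([np.cos(alpha), np.sin(alpha)])
W = np.array([np.cos(beta), np.sin(beta)])
theta = beta - alpha                         # angle between W and V

# Scale to arbitrary lengths, as in Case 3.
V, W = 3.0 * V, 0.5 * W

lhs = W @ V
rhs = np.linalg.norm(V) * np.linalg.norm(W) * np.cos(theta)
print(np.isclose(lhs, rhs))                  # True
```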

Consequences:

  1. Cauchy-Schwarz inequality: \(V, W \in \mathbb{R}^n\), \(|\langle V, W \rangle| \leq \| V \| \cdot \| W \|\) (immediate, since \(|\cos(\theta)| \leq 1\))

  2. Triangle inequality: \(V, W \in \mathbb{R}^n\)
    \(\| V + W \| \leq \| V \| + \| W \|\)

Equivalently (both sides are nonnegative, so we may compare squares):
\(\| V + W \|^2 \leq (\| V \| + \| W \|)^2\)

\(\| V + W \|^2 = \langle V + W, V + W \rangle = \langle V, V \rangle + 2 \langle V, W \rangle + \langle W, W \rangle\)

\(= \| V \|^2 + 2 \langle V, W \rangle + \| W \|^2\)
\(\leq \| V \|^2 + 2 \| V \| \| W \| + \| W \|^2\) (by Cauchy-Schwarz)
\(= (\| V \| + \| W \|)^2\)
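
Both consequences are easy to check numerically (a sketch with random example vectors, assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(0)
V = rng.standard_normal(5)        # random example vectors in R^5
W = rng.standard_normal(5)

# Cauchy-Schwarz: |<V, W>| <= ||V|| ||W||
print(abs(V @ W) <= np.linalg.norm(V) * np.linalg.norm(W))             # True

# Triangle inequality: ||V + W|| <= ||V|| + ||W||
print(np.linalg.norm(V + W) <= np.linalg.norm(V) + np.linalg.norm(W))  # True
```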


Projections


Given \(V, W \in \mathbb{R}^n\) with \(V \neq 0\), find \(\lambda \in \mathbb{R}\) such that \((\lambda V - W) \perp V\).

The choice \(\lambda = \left\langle W, \frac{V}{\|V\|^2} \right\rangle\) works.

To check this, compute:
\(\langle \lambda V - W, V \rangle = \left\langle \left\langle W, \tfrac{V}{\|V\|^2} \right\rangle V - W,\; V \right\rangle\)
\(= \left\langle \left\langle W, \tfrac{V}{\|V\|^2} \right\rangle V,\; V \right\rangle - \langle W, V \rangle\)
\(= \left\langle W, \tfrac{V}{\|V\|^2} \right\rangle \langle V, V \rangle - \langle W, V \rangle\)
\(= \langle W, V \rangle - \langle W, V \rangle = 0\) (since \(\langle V, V \rangle = \|V\|^2\))

We can define the projection:

\(\text{Proj}_{V}(W) = \left\langle W, \frac{V}{\|V\|} \right\rangle \frac{V}{\|V\|} = \lambda V\)
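
A minimal NumPy sketch of this projection (the helper name proj is ours, not a library function):

```python
import numpy as np

def proj(W: np.ndarray, V: np.ndarray) -> np.ndarray:
    """Orthogonal projection of W onto the line spanned by V (V != 0)."""
    return (W @ V) / (V @ V) * V

W = np.array([2.0, 3.0])
V = np.array([1.0, 1.0])
P = proj(W, V)
print(P)             # [2.5 2.5]
print((P - W) @ V)   # 0.0: the residual lambda*V - W is orthogonal to V
```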

Linear Systems of Equations

\(A \in \mathbb{R}^{m \times n}\), \(X \in \mathbb{R}^n\), \(X=\begin{pmatrix}x_1\\ x_2\\ \vdots\\ x_{n}\end{pmatrix}\), \(b \in \mathbb{R}^m\), \(b = \begin{pmatrix} b_1 \\ b_2 \\ \vdots \\ b_m \end{pmatrix}\)

The problem to solve is: \(A \cdot X = b\)

\(\begin{pmatrix}a_{11} & a_{12} & \cdots & a_{1n}\\ a_{21} & a_{22} & \cdots & a_{2n}\\ \vdots & \vdots & \ddots & \vdots\\ a_{m1} & a_{m2} & \cdots & a_{mn}\end{pmatrix}\begin{pmatrix}x_1\\ x_2\\ \vdots\\ x_{n}\end{pmatrix}=\begin{pmatrix}b_1\\ b_2\\ \vdots\\ b_{m}\end{pmatrix}\)

Equivalent form:

\(\begin{aligned} a_{11} x_1 + a_{12} x_2 + \cdots + a_{1n} x_n &= b_1 \\ a_{21} x_1 + a_{22} x_2 + \cdots + a_{2n} x_n &= b_2 \\ &\vdots \\ a_{m1} x_1 + a_{m2} x_2 + \cdots + a_{mn} x_n &= b_m \end{aligned}\)

Some examples:

  1. \(\begin{aligned} 2 x_1 + 2 x_2 &= 2 \\ x_1 + 2 x_2 &= 3 \end{aligned}\)

  2. \(\begin{aligned} 2 x_1 + x_2 + x_3 &= 0 \\ -x_1 + x_2 + x_3 &= 1 \\ x_1 - x_2 + 2 x_3 &= 2 \end{aligned}\)

Problem 1: Write it in matrix form: \(\begin{pmatrix} 2 & 2 \\ 1 & 2 \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} 2 \\ 3 \end{pmatrix}\)

If \(A\) is invertible: \(A \cdot X = b\), \(A^{-1} \cdot A \cdot X = A^{-1} \cdot b\), \(X = A^{-1} \cdot b\)

\(X=\begin{pmatrix}1 & -1\\ -\frac12 & 1\end{pmatrix}\begin{pmatrix}2\\ 3\end{pmatrix}=\begin{pmatrix}-1\\ 2\end{pmatrix}\)

Note that \(A = \begin{pmatrix} 2 & 2 \\ 1 & 2 \end{pmatrix}\), \(\det(A) = 2 \neq 0\)

There is an inverse \(A^{-1} = \frac{1}{2} \begin{pmatrix} 2 & -2 \\ -1 & 2 \end{pmatrix}\)

\(A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}\), \(\det(A) \neq 0\), then: \(A^{-1} = \frac{1}{\det(A)} \begin{pmatrix} d & -b \\ -c & a \end{pmatrix}\)

Check: \(2(-1) + 2(2) = 2 \quad \checkmark\), \(-1 + 2(2) = 3 \quad \checkmark\)
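
The same computation in NumPy, as a sketch (in practice np.linalg.solve is preferred over forming the inverse explicitly):

```python
import numpy as np

A = np.array([[2.0, 2.0],
              [1.0, 2.0]])
b = np.array([2.0, 3.0])

print(np.linalg.det(A))       # 2.0: nonzero, so A is invertible
print(np.linalg.inv(A))       # [[ 1.  -1. ]
                              #  [-0.5  1. ]]
print(np.linalg.solve(A, b))  # [-1.  2.]
```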

Gaussian Elimination

In problem 2:

(Eq. 3) \(- \frac{1}{2}\) (Eq. 1):

from \(x_1 - x_2 + 2 x_3 = 2\) subtract \(\frac{1}{2}(2 x_1 + x_2 + x_3) = 0\),

which gives \(0 \cdot x_1 - \frac{3}{2} x_2 + \frac{3}{2} x_3 = 2\)

This is the same as performing row operations.

We will proceed as in the method for finding the inverse of a matrix, working with the augmented matrix:

\(\left(\begin{array}{ccc|c} 2 & 1 & 1 & 0 \\ -1 & 1 & 1 & 1 \\ 1 & -1 & 2 & 2 \end{array}\right)\)

\(R_2 + \frac{1}{2}R_1 \rightarrow R_2\), \(R_3 - \frac{1}{2}R_1 \rightarrow R_3\)

\(\left(\begin{array}{ccc|c}2 & 1 & 1 & 0\\ 0 & \frac32 & \frac32 & 1\\ 0 & -\frac32 & \frac32 & 2\end{array}\right)\)

\(R_3 + R_2 \rightarrow R_3\)

\(\left(\begin{array}{ccc|c}2 & 1 & 1 & 0\\ 0 & \frac32 & \frac32 & 1\\ 0 & 0 & 3 & 3\end{array}\right)\)

Back-substitution: \(2x_1 + x_2 + x_3 = 0\), \(\frac{3}{2} x_2 + \frac{3}{2} x_3 = 1\), \(3 x_3 = 3 \implies x_3 = 1\)

\(\frac{3}{2}x_2 + \frac{3}{2} = 1\)

\(x_2 = \frac{2}{3}\left(1 - \frac{3}{2}\right) = \frac{2}{3} - 1 = -\frac{1}{3}\)

\(2x_1 - \frac{1}{3} + 1 = 0\)

\(2x_1 = -1 + \frac{1}{3} = -\frac{2}{3}\)

\(x_1 = -\frac{1}{3}\)

The solution to \(A \cdot X = b\) is \(X = \begin{pmatrix} -\frac{1}{3} \\ -\frac{1}{3} \\ 1 \end{pmatrix}\)

Note that \(A^{-1} = \begin{pmatrix} \frac{1}{3} & -\frac{1}{3} & 0 \\ \frac{1}{3} & \frac{1}{3} & -\frac{1}{3} \\ 0 & \frac{1}{3} & \frac{1}{3} \end{pmatrix}\)

We can verify that \(A^{-1} \begin{pmatrix} 0 \\ 1 \\ 2 \end{pmatrix} = \begin{pmatrix} -\frac{1}{3} \\ -\frac{1}{3} \\ 1 \end{pmatrix}\) \(\checkmark\)
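
Both routes can be checked in NumPy (a quick sketch):

```python
import numpy as np

A = np.array([[ 2.0,  1.0, 1.0],
              [-1.0,  1.0, 1.0],
              [ 1.0, -1.0, 2.0]])
b = np.array([0.0, 1.0, 2.0])

print(np.linalg.solve(A, b))  # [-0.3333... -0.3333...  1.]
print(np.linalg.inv(A) @ b)   # the same vector, via the explicit inverse
```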

More examples

  1. \(\begin{aligned} x_1 + 2x_2 + x_3 &= 1 \\ -x_1 + x_2 + x_3 &= 1 \\ -x_1 + 4x_2 + 3x_3 &= 2 \end{aligned}\)

Matrix form: \(\begin{pmatrix}1 & 2 & 1\\ -1 & 1 & 1\\ -1 & 4 & 3\end{pmatrix}X=\begin{pmatrix}1\\ 1\\ 2\end{pmatrix}\)

To find the solution: Gauss elimination. \(\left(\begin{array}{ccc|c} 1 & 2 & 1 & 1 \\ -1 & 1 & 1 & 1 \\ -1 & 4 & 3 & 2 \end{array}\right)\)

Step 1: \(R_2 + R_1 \rightarrow R_2\), \(R_3 + R_1 \rightarrow R_3\) \(\rightarrow \left(\begin{array}{ccc|c} 1 & 2 & 1 & 1 \\ 0 & 3 & 2 & 2 \\ 0 & 6 & 4 & 3 \end{array}\right)\)

Step 2: \(R_3 - 2R_2 \rightarrow R_3\) \(\rightarrow \left(\begin{array}{ccc|c} 1 & 2 & 1 & 1 \\ 0 & 3 & 2 & 2 \\ 0 & 0 & 0 & -1 \end{array}\right)\)

Then the last row reads \(0 \cdot x_1 + 0 \cdot x_2 + 0 \cdot x_3 = -1\), which is impossible. The system has no solution.
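
In NumPy the inconsistency shows up as a rank jump between \(A\) and the augmented matrix (a sketch using np.linalg.matrix_rank):

```python
import numpy as np

A = np.array([[ 1.0, 2.0, 1.0],
              [-1.0, 1.0, 1.0],
              [-1.0, 4.0, 3.0]])
b = np.array([1.0, 1.0, 2.0])

Ab = np.column_stack([A, b])      # augmented matrix (A | b)
print(np.linalg.matrix_rank(A))   # 2
print(np.linalg.matrix_rank(Ab))  # 3: the rank jumps, so no solution exists
```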

  2. The same matrix, different \(b\): \(A \cdot X = b\), \(b = \begin{pmatrix} 1 \\ 1 \\ 3 \end{pmatrix}\)

Again, Gauss elimination: \(\left(\begin{array}{ccc|c} 1 & 2 & 1 & 1 \\ -1 & 1 & 1 & 1 \\ -1 & 4 & 3 & 3 \end{array}\right)\)\(\rightarrow \left(\begin{array}{ccc|c} 1 & 2 & 1 & 1 \\ 0 & 3 & 2 & 2 \\ 0 & 6 & 4 & 4 \end{array}\right)\)\(\rightarrow \left(\begin{array}{ccc|c} 1 & 2 & 1 & 1 \\ 0 & 3 & 2 & 2 \\ 0 & 0 & 0 & 0 \end{array}\right)\)

Note that \(R_3 = R_1 + 2R_2\) (including the right-hand side), so the third equation is redundant: the system is consistent and has infinitely many solutions, with \(x_3\) free.
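
Now the ranks of \(A\) and \((A \mid b)\) agree but are less than 3, so solutions form a line. A sketch of the one-parameter family, obtained by back-substitution from the reduced rows \(x_1 + 2x_2 + x_3 = 1\) and \(3x_2 + 2x_3 = 2\) with \(x_3 = t\):

```python
import numpy as np

A = np.array([[ 1.0, 2.0, 1.0],
              [-1.0, 1.0, 1.0],
              [-1.0, 4.0, 3.0]])
b = np.array([1.0, 1.0, 3.0])

Ab = np.column_stack([A, b])
print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(Ab))  # 2 2: consistent, one free variable

# x3 = t is free; back-substitution gives x2 = (2 - 2t)/3 and x1 = (t - 1)/3.
for t in (0.0, 1.0, 3.0):
    x = np.array([(t - 1) / 3, (2 - 2 * t) / 3, t])
    print(np.allclose(A @ x, b))  # True for every t
```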