
9.10 Matrices

Particular case

\(A \in \mathbb{R}^{n \times 1}\) (n rows, 1 column) \(A=\begin{pmatrix}a_{11}\\ a_{21}\\ \vdots\\ a_{n1}\end{pmatrix}\)

We identify matrices in \(\mathbb{R}^{n \times 1}\) with vectors in \(\mathbb{R}^n\).

Example

Let \(A \in \mathbb{R}^{3 \times 3}\) such that \((A)_{ij} = i + j\), where \(1 \leq i \leq 3\), \(1 \leq j \leq 3\), then \(A = \begin{pmatrix} 2 & 3 & 4 \\ 3 & 4 & 5 \\ 4 & 5 & 6 \end{pmatrix}\)
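A matrix whose entries are given by a formula like this can be generated directly; a quick sketch using NumPy (the library choice is ours, not part of the notes):

```python
import numpy as np

# Entries (A)_{ij} = i + j with 1-based indices i, j as in the text;
# NumPy indices are 0-based, hence the +1 shifts.
A = np.fromfunction(lambda i, j: (i + 1) + (j + 1), (3, 3), dtype=int)
print(A)  # rows (2 3 4), (3 4 5), (4 5 6)
```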

Operations

Sum of matrices of the same size \(m \times n\): If \(A, B \in \mathbb{R}^{m \times n}\), the sum \(A + B\) is a matrix in \(\mathbb{R}^{m \times n}\) defined by: \((A + B)_{ij} = a_{ij} + b_{ij}\)

Coordinate-wise operation

Example

If \(A = \begin{pmatrix} 2 & 3 \\ 5 & 6 \\ 1 & 0 \end{pmatrix}\) and \(B = \begin{pmatrix} -1 & 0 \\ 2 & -1 \\ 7 & 1 \end{pmatrix}\), then \(A + B = \begin{pmatrix} 1 & 3 \\ 7 & 5 \\ 8 & 1 \end{pmatrix}\)

Product by scalars

If \(\lambda \in \mathbb{R}\), \(A \in \mathbb{R}^{m \times n}\), then \((\lambda A)_{ij} = \lambda a_{ij}\)
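Both operations are coordinate-wise, so in NumPy (our choice of tool here) they are the ordinary `+` and `*` on arrays; this sketch reuses the sum example above:

```python
import numpy as np

A = np.array([[2, 3], [5, 6], [1, 0]])
B = np.array([[-1, 0], [2, -1], [7, 1]])

S = A + B   # (A + B)_{ij} = a_{ij} + b_{ij}, coordinate-wise
P = 3 * A   # (lambda A)_{ij} = lambda a_{ij}, here lambda = 3

print(S)  # rows (1 3), (7 5), (8 1), as in the example above
```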

Properties

  • \(A + B = B + A\)
  • \(\lambda (A + B) = \lambda A + \lambda B\)
  • \((\lambda + \mu) A = \lambda A + \mu A\)
  • \(\lambda (\mu A) = (\lambda \mu) A\)

Product of matrix times vector

Let \(A \in \mathbb{R}^{m \times n}\), \(v \in \mathbb{R}^n\). The product \(A \cdot v\) will be a vector \(w \in \mathbb{R}^m\), defined as \(w = \begin{pmatrix} w_1 \\ w_2 \\ \vdots \\ w_m \end{pmatrix}\) with \(w_k = \sum_{j=1}^{n} a_{kj} v_j\) for \(1 \leq k \leq m\). So: \(w_k = a_{k1} v_1 + a_{k2} v_2 + a_{k3} v_3 + \dots + a_{kn} v_n\).

Example 1

Let \(A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \\ 5 & 6 \end{pmatrix} \in \mathbb{R}^{3 \times 2}\), and \(V = \begin{pmatrix} 1 \\ 2 \end{pmatrix} \in \mathbb{R}^2\). Then: \(A \cdot V = \begin{pmatrix} 1 & 2 \\ 3 & 4 \\ 5 & 6 \end{pmatrix} \begin{pmatrix} 1 \\ 2 \end{pmatrix} = \begin{pmatrix} 5 \\ 11 \\ 17 \end{pmatrix} \in \mathbb{R}^{3 \times 1}\).

Example 2

Let \(B = \begin{pmatrix} 2 & 0 \\ 0 & -1 \end{pmatrix} \in \mathbb{R}^{2 \times 2}\), and \(V = \begin{pmatrix} 5 \\ 3 \end{pmatrix} \in \mathbb{R}^{2 \times 1}\). Then: \(B \cdot V = \begin{pmatrix} 2 & 0 \\ 0 & -1 \end{pmatrix} \begin{pmatrix} 5 \\ 3 \end{pmatrix} = \begin{pmatrix} 10 \\ -3 \end{pmatrix} \in \mathbb{R}^{2 \times 1}\).
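Each entry \(w_k\) is the dot product of row \(k\) of \(A\) with \(v\); NumPy's `@` operator computes exactly this (a sketch, reusing Example 1):

```python
import numpy as np

A = np.array([[1, 2], [3, 4], [5, 6]])  # 3 x 2
v = np.array([1, 2])                    # a vector in R^2

# w_k = sum_j a_{kj} v_j for each row k
w = A @ v
print(w)  # (5, 11, 17), as in Example 1
```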

Note

This operation allows us to define functions \(T: \mathbb{R}^n \rightarrow \mathbb{R}^m\).

\(\mathbb{R}^{n}=\left\{v=\begin{pmatrix}v_1\\ \vdots\\ v_{n}\end{pmatrix}:v_{j}\in\mathbb{R}\ \forall j=1,\dots,n\right\}\)

\(\mathbb{R}^m = \left\{ v = \begin{pmatrix} v_1 \\ \vdots \\ v_m \end{pmatrix} : v_j \in \mathbb{R} \ \forall j = 1, \dots, m \right\}\)

We can define, given a matrix \(A \in \mathbb{R}^{m \times n}\), a function \(T: \mathbb{R}^n \rightarrow \mathbb{R}^m\).


\(T(v) = A \cdot v\)

Toy example: if \(A \in \mathbb{R}^{1 \times 1}\), i.e. \(A = a \in \mathbb{R}\), then \(T: \mathbb{R} \rightarrow \mathbb{R}\), \(T(x) = a \cdot x\).


Properties

\(A \in \mathbb{R}^{m \times n}\), \(v, w \in \mathbb{R}^n\), \(\lambda \in \mathbb{R}\)

  1. \(A \cdot (v + w) = A \cdot v + A \cdot w\)

  2. \(A \cdot (\lambda v) = \lambda A \cdot v\)

  3. Consequence \(v, w \in \mathbb{R}^n\), \(\alpha, \beta \in \mathbb{R}\)

    \(A \cdot (\alpha v + \beta w) = \alpha A \cdot v + \beta A \cdot w\)

Proof
  1. \(A \cdot (v + w) \in \mathbb{R}^m\)

    \(A \cdot (v + w) = \begin{pmatrix} c_1 \\ c_2 \\ \vdots \\ c_m \end{pmatrix}\)

    \(A \cdot v + A \cdot w \in \mathbb{R}^m\)

    Fix \(k\), \(1 \leq k \leq m\), compute \(c_k\): \((A\cdot(v+w))_{k}=\sum_{j=1}^{n}a_{kj}(v_{j}+w_{j})=\sum_{j=1}^{n}a_{kj}v_{j}+\sum_{j=1}^{n}a_{kj}w_{j}\)

    Thus, \((A\cdot(v+w))_k = (A \cdot v)_k + (A \cdot w)_k\).

  2. \(A\cdot(\lambda v)\in\mathbb{R}^{m}\)

    \(A\cdot(\lambda v)=\begin{pmatrix}c_1\\ c_2\\ \vdots\\ c_{m}\end{pmatrix}\)

    \(\lambda Av\in\mathbb{R}^{m}\)

    Fix \(k\), \(1 \leq k \leq m\), compute \(c_k\): \((A\cdot(\lambda v))_{k}=\sum_{j=1}^{n}a_{kj}(\lambda v_{j})=\lambda\sum_{j=1}^{n}a_{kj}v_{j}=\lambda (A\cdot v)_{k}\)

Product of matrices

Let \(A \in \mathbb{R}^{m \times n}\), \(B \in \mathbb{R}^{n \times k}\).

\(A = (a_{ij})_{1 \leq i \leq m, 1 \leq j \leq n}\), \(B = (b_{ij})_{1 \leq i \leq n, 1 \leq j \leq k}\).

The matrix \(A \cdot B\) will be a matrix in \(\mathbb{R}^{m \times k}\) defined by: \((A\cdot B)_{ij}=\sum_{l=1}^{n}a_{il}b_{lj}\).


Example

Let \(A = \begin{pmatrix} 1 & -1 \\ 0 & 1 \\ 2 & 1 \end{pmatrix} \in \mathbb{R}^{3 \times 2}\) and \(B = \begin{pmatrix} 1 & 1 & 0 & 4 \\ -1 & 2 & -1 & 1 \end{pmatrix} \in \mathbb{R}^{2 \times 4}\)

\(A \cdot B \in \mathbb{R}^{3 \times 4}\)

\(A \cdot B = \begin{pmatrix} 1 & -1 \\ 0 & 1 \\ 2 & 1 \end{pmatrix} \begin{pmatrix} 1 & 1 & 0 & 4 \\ -1 & 2 & -1 & 1 \end{pmatrix} = \begin{pmatrix} 2 & -1 & 1 & 3 \\ -1 & 2 & -1 & 1 \\ 1 & 4 & -1 & 9 \end{pmatrix}\).
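The same `@` operator computes the matrix product; a quick numerical check of this example (NumPy is our tool of choice, not part of the notes):

```python
import numpy as np

A = np.array([[1, -1], [0, 1], [2, 1]])       # 3 x 2
B = np.array([[1, 1, 0, 4], [-1, 2, -1, 1]])  # 2 x 4

C = A @ B  # (A B)_{ij} = sum_l a_{il} b_{lj}, a 3 x 4 matrix
print(C.shape)  # (3, 4)
```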

Remark

We cannot compute \(B \cdot A\): here \(B \in \mathbb{R}^{2 \times 4}\) and \(A \in \mathbb{R}^{3 \times 2}\), and \(4 \neq 3\).

The product \(A \cdot B\) can only be computed if \(A \in \mathbb{R}^{m \times n}\), \(B \in \mathbb{R}^{n \times k}\).

Example

Let \(A = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}\) and \(B = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}\), where \(A \in \mathbb{R}^{2 \times 2}\), \(B \in \mathbb{R}^{2 \times 2}\).

We can compute \(A \cdot B\) and \(B \cdot A\):

\(A \cdot B = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix} \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} = \begin{pmatrix} 1 & 1 \\ 1 & 0 \end{pmatrix} \in \mathbb{R}^{2 \times 2}\).

\(B \cdot A = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix} = \begin{pmatrix} 0 & 1 \\ 1 & 1 \end{pmatrix} \in \mathbb{R}^{2 \times 2}\).

In general, \(A \cdot B \neq B \cdot A\).
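The non-commutativity in this example is easy to verify numerically; a sketch:

```python
import numpy as np

A = np.array([[1, 1], [0, 1]])
B = np.array([[0, 1], [1, 0]])

# Both products exist (both matrices are 2 x 2), yet they differ.
AB = A @ B
BA = B @ A
print(np.array_equal(AB, BA))  # False: the product is not commutative
```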

Example

Consider \(\alpha, \beta \in \mathbb{R}\), \(\alpha, \beta \in [0, 2\pi)\).

Define the matrix \(R_\alpha \in \mathbb{R}^{2 \times 2}\):

\(R_\alpha = \begin{pmatrix} \cos \alpha & -\sin \alpha \\ \sin \alpha & \cos \alpha \end{pmatrix}\).

\(R_\pi = \begin{pmatrix} -1 & 0 \\ 0 & -1 \end{pmatrix}\).

Let us check that \(R_\alpha \cdot R_\beta = R_{\alpha + \beta}\):

\(R_\alpha \cdot R_\beta = \begin{pmatrix} \cos \alpha & -\sin \alpha \\ \sin \alpha & \cos \alpha \end{pmatrix} \begin{pmatrix} \cos \beta & -\sin \beta \\ \sin \beta & \cos \beta \end{pmatrix}\)

\(=\begin{pmatrix}\cos\alpha\cos\beta-\sin\alpha\sin\beta & -\cos\alpha\sin\beta-\sin\alpha\cos\beta\\ \sin\alpha\cos\beta+\cos\alpha\sin\beta & -\sin\alpha\sin\beta+\cos\alpha\cos\beta\end{pmatrix}\)

\(= \begin{pmatrix} \cos (\alpha + \beta) & -\sin (\alpha + \beta) \\ \sin (\alpha + \beta) & \cos (\alpha + \beta) \end{pmatrix}\).
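The identity \(R_\alpha R_\beta = R_{\alpha+\beta}\) can also be spot-checked numerically for arbitrary angles (a sketch; `allclose` absorbs floating-point error):

```python
import numpy as np

def R(t):
    # Rotation of the plane by angle t (radians)
    return np.array([[np.cos(t), -np.sin(t)],
                     [np.sin(t),  np.cos(t)]])

a, b = 0.7, 1.2
# Composing two rotations rotates by the sum of the angles
assert np.allclose(R(a) @ R(b), R(a + b))
```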

Transpose of a matrix

Let \(A \in \mathbb{R}^{m \times n}\) be a matrix. We define the "Transpose" of \(A\) by \(A^T \in \mathbb{R}^{n \times m}\), \(A=(a_{ij})_{1\leq i\leq m,1\leq j\leq n}\)

\((A^T)_{ij} = a_{ji}\)

Example

\(A = \begin{pmatrix} 2 & 3 & -1 \\ 1 & 0 & 4 \end{pmatrix} \in \mathbb{R}^{2 \times 3}\), \(A^T \in \mathbb{R}^{3 \times 2}\)

\(A^T = \begin{pmatrix} 2 & 1 \\ 3 & 0 \\ -1 & 4 \end{pmatrix}\)

\(B = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} \in \mathbb{R}^{2 \times 2}\)

\(B^T = \begin{pmatrix} 1 & 3 \\ 2 & 4 \end{pmatrix}\)
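In NumPy the transpose is the `.T` attribute; checking both examples above:

```python
import numpy as np

A = np.array([[2, 3, -1], [1, 0, 4]])  # 2 x 3
B = np.array([[1, 2], [3, 4]])

# (A^T)_{ij} = a_{ji}: rows become columns
print(A.T.shape)  # (3, 2)
print(B.T)        # rows (1 3), (2 4)
```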

Property

Let \(A \in \mathbb{R}^{m \times n}\), \(B \in \mathbb{R}^{n \times k}\). We can compute \(A \cdot B \in \mathbb{R}^{m \times k}\), so \((A \cdot B)^T \in \mathbb{R}^{k \times m}\). Then \((A \cdot B)^T = B^T \cdot A^T\).

For the proof

Check that: \((A \cdot B)^T_{ij} = (B^T \cdot A^T)_{ij}\).

Fix \(1 \leq i \leq k\), \(1 \leq j \leq m\). Then: \(((A \cdot B)^T)_{ij} = (A\cdot B)_{ji} = \sum_{l=1}^{n} A_{jl} B_{li} = \sum_{l=1}^{n} (A^T)_{lj} (B^T)_{il} = \sum_{l=1}^{n} (B^T)_{il} (A^T)_{lj} = (B^T \cdot A^T)_{ij}\)
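The identity \((A B)^T = B^T A^T\) can be spot-checked on random matrices of compatible sizes (a sketch; the seed is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.integers(-5, 6, size=(3, 2))  # m x n
B = rng.integers(-5, 6, size=(2, 4))  # n x k

# (A B)^T is k x m and equals B^T A^T entry by entry
assert np.array_equal((A @ B).T, B.T @ A.T)
```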

Identity matrix

Let \(I_n \in \mathbb{R}^{n \times n}\) be the matrix defined by \((I_n)_{ij} = \begin{cases} 1 & \text{if } i = j \\ 0 & \text{if } i \neq j \end{cases}\)

\(I_2 = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}~~~I_3 = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}\)

Exercise

Let \(A\in\mathbb{R}^{n\times n}\), then \(I_{n}\cdot A=A\cdot I_{n}=A\).
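The exercise is immediate to check numerically; `np.eye(n)` builds \(I_n\):

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
I = np.eye(2, dtype=int)  # the identity matrix I_2

# I_n acts as the neutral element for the matrix product
assert np.array_equal(I @ A, A)
assert np.array_equal(A @ I, A)
```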

Inverse matrix

Let \(A \in \mathbb{R}^{n \times n}\), we say that \(A\) is invertible if there exists \(A^{-1} \in \mathbb{R}^{n \times n}\) such that \(A \cdot A^{-1} = I_n = A^{-1} \cdot A\)

Example

Let \(A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}\). Check that \(A^{-1} = \begin{pmatrix} -2 & 1 \\ \frac{3}{2} & -\frac{1}{2} \end{pmatrix}\).

Let's check: \(A^{-1} \cdot A = \begin{pmatrix} -2 & 1 \\ \frac{3}{2} & -\frac{1}{2} \end{pmatrix} \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}\)

\(A\cdot A^{-1}=\begin{pmatrix}1 & 2\\ 3 & 4\end{pmatrix}\begin{pmatrix}-2 & 1\\ \frac32 & -\frac12\end{pmatrix}=\begin{pmatrix}1 & 0\\ 0 & 1\end{pmatrix}\)
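NumPy computes inverses with `np.linalg.inv`; verifying this example (`allclose` because the computation is in floating point):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
Ainv = np.linalg.inv(A)

# A . A^{-1} = I = A^{-1} . A, up to floating-point error
assert np.allclose(Ainv, [[-2.0, 1.0], [1.5, -0.5]])
assert np.allclose(A @ Ainv, np.eye(2))
assert np.allclose(Ainv @ A, np.eye(2))
```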

Example

\(A = \begin{pmatrix} 1 & 2 \\ 2 & 4 \end{pmatrix}\), suppose that there is an inverse \(A^{-1} = \begin{pmatrix} a & b \\ c & d \end{pmatrix}\)

Suppose that \(\begin{pmatrix} 1 & 2 \\ 2 & 4 \end{pmatrix} \begin{pmatrix} a & b \\ c & d \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}\)

Then \(a + 2c = 1\) and \(2a + 4c = 0\). But \(2a + 4c = 2(a + 2c) = 2 \cdot 1 = 2 \neq 0\).

Contradiction! \(A\) has no inverse!

How to find an inverse (at least in \(\mathbb{R}^{2 \times 2}\))

Consider the matrix \(A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}\).

First: if \(a = c = 0\), there is no inverse! (Indeed \(A \cdot e_1 = 0\) for \(e_1 = \begin{pmatrix} 1 \\ 0 \end{pmatrix}\), so \(A^{-1} \cdot A \cdot e_1 = 0 \neq e_1\).)

Suppose \(a \neq 0\). Consider the matrix \(E_1 = \begin{pmatrix} 1 & 0 \\ -\frac{c}{a} & 1 \end{pmatrix}\)

Compute \(\begin{aligned} E_1 \cdot A &= \begin{pmatrix} 1 & 0 \\ -\frac{c}{a} & 1 \end{pmatrix} \cdot \begin{pmatrix} a & b \\ c & d \end{pmatrix} = \begin{pmatrix} a & b \\ 0 & d - \frac{bc}{a} \end{pmatrix} = \begin{pmatrix} a & b \\ 0 & \frac{ad - bc}{a} \end{pmatrix} \end{aligned}\)

Define \(E_2=\begin{pmatrix}1 & -\frac{ab}{ad-bc}\\ 0 & 1\end{pmatrix}\), if \(ad-bc\neq 0\)

Compute \(\begin{aligned} E_2 \cdot E_1 \cdot A &= \begin{pmatrix} 1 & -\frac{ab}{ad - bc} \\ 0 & 1 \end{pmatrix} \cdot \begin{pmatrix} a & b \\ 0 & \frac{ad - bc}{a} \end{pmatrix} = \begin{pmatrix} a & 0 \\ 0 & \frac{ad - bc}{a} \end{pmatrix} \end{aligned}\)

Last step: Define \(E_3 = \begin{pmatrix} \frac{1}{a} & 0 \\ 0 & \frac{a}{ad - bc} \end{pmatrix}\)

Compute \(\begin{aligned} E_3 \cdot E_2 \cdot E_1 \cdot A &= \begin{pmatrix} \frac{1}{a} & 0 \\ 0 & \frac{a}{ad - bc} \end{pmatrix} \cdot \begin{pmatrix} a & 0 \\ 0 & \frac{ad - bc}{a} \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} \end{aligned}\)

Then: \(A^{-1} = E_3 E_2 E_1\)

\(A^{-1}=\begin{pmatrix}\frac{1}{a} & 0\\ 0 & \frac{a}{ad-bc}\end{pmatrix}\cdot\begin{pmatrix}1 & -\frac{ba}{ad-bc}\\ 0 & 1\end{pmatrix}\cdot\begin{pmatrix}1 & 0\\ -\frac{c}{a} & 1\end{pmatrix}=\begin{pmatrix}\frac{1}{a} & 0\\ 0 & \frac{a}{ad-bc}\end{pmatrix}\cdot\begin{pmatrix}\frac{ad}{ad-bc} & -\frac{ba}{ad-bc}\\ -\frac{c}{a} & 1\end{pmatrix}=\begin{pmatrix}\frac{d}{ad-bc} & \frac{-b}{ad-bc}\\ \frac{-c}{ad-bc} & \frac{a}{ad-bc}\end{pmatrix}\)

\(A^{-1} = \frac{1}{ad - bc} \begin{pmatrix} d & -b \\ -c & a \end{pmatrix}\), if \(a \neq 0\) and \(ad - bc \neq 0\).

If \(a = 0\) but \(c \neq 0\), the same procedure works after first swapping the rows: \(\begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} 0 & b \\ c & d \end{pmatrix} = \begin{pmatrix} c & d \\ 0 & b \end{pmatrix}\)

Then, if \(ad - bc \neq 0\), \(\exists A^{-1} \in \mathbb{R}^{2 \times 2}\).
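The closed-form \(2 \times 2\) inverse derived above translates directly to code; a minimal sketch (the function name `inv2x2` is ours):

```python
import numpy as np

def inv2x2(M):
    # A^{-1} = 1/(ad - bc) * [[d, -b], [-c, a]], defined when ad - bc != 0
    a, b = M[0]
    c, d = M[1]
    det = a * d - b * c
    if det == 0:
        raise ValueError("ad - bc = 0: the matrix has no inverse")
    return np.array([[d, -b], [-c, a]]) / det

A = np.array([[1.0, 2.0], [3.0, 4.0]])
assert np.allclose(inv2x2(A) @ A, np.eye(2))
```

Note that the earlier matrix \(\begin{pmatrix} 1 & 2 \\ 2 & 4 \end{pmatrix}\) has \(ad - bc = 0\), so the function correctly raises an error for it.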

A practical method to find the inverse

Let \(A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}\)

Work with the arrangement: \(\left( \begin{array}{cc|cc} 1 & 2 & 1 & 0 \\ 3 & 4 & 0 & 1 \end{array} \right)\)

Perform row operations:

  1. \(R_2-3R_1\rightarrow R_2\Rightarrow\left(\begin{array}{cc|cc}1 & 2 & 1 & 0\\ 0 & -2 & -3 & 1\end{array}\right)\)

  2. \(-\frac12R_2\rightarrow R_2\Rightarrow\left(\begin{array}{cc|cc}1 & 2 & 1 & 0\\ 0 & 1 & \frac32 & -\frac12\end{array}\right)\)

  3. \(R_1-2R_2\rightarrow R_1\Rightarrow\left(\begin{array}{cc|cc}1 & 0 & -2 & 1\\ 0 & 1 & \frac32 & -\frac12\end{array}\right)\)

Now the matrix on the left side is the identity matrix \(I_2\), and the matrix on the right side is the inverse of \(A\): \(A^{-1} = \begin{pmatrix} -2 & 1 \\ \frac{3}{2} & -\frac{1}{2} \end{pmatrix}\)
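The row-operation method generalizes to any \(n \times n\) matrix: augment \(A\) with \(I_n\) and reduce the left block to the identity. A minimal sketch without pivot swapping, so it assumes every pivot it meets is nonzero (true for the example above; the name `inverse_by_row_reduction` is ours):

```python
import numpy as np

def inverse_by_row_reduction(A):
    # Reduce the arrangement [A | I] to [I | A^{-1}] by row operations.
    # Assumes each pivot M[i, i] is nonzero when reached (no row swaps).
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])
    for i in range(n):
        M[i] = M[i] / M[i, i]                 # scale pivot row: pivot becomes 1
        for r in range(n):
            if r != i:
                M[r] = M[r] - M[r, i] * M[i]  # clear column i in the other rows
    return M[:, n:]

A = np.array([[1.0, 2.0], [3.0, 4.0]])
print(inverse_by_row_reduction(A))  # rows (-2, 1), (1.5, -0.5)
```

The two loops reproduce exactly the three row operations performed by hand above.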