9.11

Definition

Let \(A \in M_{n \times n}(\mathbb{R})\) be a square matrix and let \(k\) be a positive integer. We define \(A^0 = I_n\) and \(A^k = A^{k-1}A\).

Example

Let \(A = \begin{pmatrix} 2 & -1 \\ 3 & 0 \end{pmatrix}\). Then
\(A^2 = A \cdot A = \begin{pmatrix} 2 & -1 \\ 3 & 0 \end{pmatrix} \begin{pmatrix} 2 & -1 \\ 3 & 0 \end{pmatrix} = \begin{pmatrix} 1 & -2 \\ 6 & -3 \end{pmatrix}\) and \(A^3 = A^2 A = \begin{pmatrix} 1 & -2 \\ 6 & -3 \end{pmatrix} \begin{pmatrix} 2 & -1 \\ 3 & 0 \end{pmatrix} = \begin{pmatrix} -4 & -1 \\ 3 & -6 \end{pmatrix}\)
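These products can be verified numerically; a minimal sketch using NumPy (not part of the original notes):

```python
import numpy as np

A = np.array([[2, -1],
              [3, 0]])

# A^2 and A^3 by repeated multiplication, as in the definition
A2 = A @ A
A3 = A2 @ A

assert (A2 == np.array([[1, -2], [6, -3]])).all()
assert (A3 == np.array([[-4, -1], [3, -6]])).all()

# np.linalg.matrix_power agrees, and also handles A^0 = I_2
assert (np.linalg.matrix_power(A, 3) == A3).all()
assert (np.linalg.matrix_power(A, 0) == np.eye(2)).all()
```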

Theorem

If \(A \in M_{n \times n}(\mathbb{R})\) and \(k,l\) are nonnegative integers, then:
\((1)\ A^k A^l = A^{k+l}\)
\((2)\ (A^k)^l = A^{kl}\)
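Both power laws can be illustrated on the matrix from the previous example (a NumPy sketch; the specific exponents are chosen arbitrarily):

```python
import numpy as np

A = np.array([[2, -1],
              [3, 0]])
k, l = 2, 3
mpow = np.linalg.matrix_power

# (1) A^k A^l = A^{k+l}
assert (mpow(A, k) @ mpow(A, l) == mpow(A, k + l)).all()

# (2) (A^k)^l = A^{kl}
assert (mpow(mpow(A, k), l) == mpow(A, k * l)).all()
```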

Definition

If \(A = (a_{ij})_{n \times n}\), the trace of \(A\), denoted \(\text{Tr}(A)\), is defined as the sum of all the elements of the main diagonal, that is, \(\text{Tr}(A) = a_{11} + a_{22} + \cdots + a_{nn} = \sum_{i=1}^{n} a_{ii}\)

Example

If \(A = \begin{pmatrix} 1 & 0 & 7 \\ 2 & -3 & 4 \\ 5 & 6 & 9 \end{pmatrix}\), then \(\text{Tr}(A) = 1 - 3 + 9 = 7\).
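NumPy computes the trace directly with `np.trace`; a quick check of this example:

```python
import numpy as np

A = np.array([[1, 0, 7],
              [2, -3, 4],
              [5, 6, 9]])

# the trace is the sum of the main-diagonal entries
assert np.trace(A) == 1 + (-3) + 9 == 7
```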

Theorem

If \(A, B \in M_{n \times n}(\mathbb{R})\) and \(\alpha \in \mathbb{R}\), then:
\((1)\ \text{Tr}(A + B) = \text{Tr}(A) + \text{Tr}(B)\),
\((2)\ \text{Tr}(\alpha A) = \alpha \text{Tr}(A)\).

Proof
(1) If \(A = (a_{ij})_{n \times n}\), \(B = (b_{ij})_{n \times n}\), then \(A + B = (a_{ij} + b_{ij})_{n \times n}\), hence
\(\text{Tr}(A + B) = \sum_{i=1}^{n}(a_{ii} + b_{ii}) = \sum_{i=1}^{n} a_{ii} + \sum_{i=1}^{n} b_{ii} = \text{Tr}(A) + \text{Tr}(B)\)
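The linearity properties above can be spot-checked on random integer matrices (a NumPy sketch; the seed and sizes are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.integers(-5, 5, size=(3, 3))
B = rng.integers(-5, 5, size=(3, 3))
alpha = 4

# (1) Tr(A + B) = Tr(A) + Tr(B)
assert np.trace(A + B) == np.trace(A) + np.trace(B)

# (2) Tr(alpha A) = alpha Tr(A)
assert np.trace(alpha * A) == alpha * np.trace(A)
```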

Theorem

Let \(A, B\) be matrices and \(\alpha \in \mathbb{R}\), with sizes such that the following operations are defined. Then:
\((1)\ (A^T)^T = A\)
\((2)\ (A + B)^T = A^T + B^T\)
\((3)\ (\alpha A)^T = \alpha A^T\)
\((4)\ (AB)^T = B^T A^T\)
\((5)\ (A^k)^T = (A^T)^k\) with \(k\) a nonnegative integer
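All five transpose properties can be verified on concrete matrices; a NumPy sketch with arbitrary random integer matrices of compatible sizes:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.integers(-5, 5, size=(2, 3))
B = rng.integers(-5, 5, size=(3, 4))
C = rng.integers(-5, 5, size=(2, 3))  # same size as A, for (2)
alpha = -2

assert (A.T.T == A).all()                    # (1)
assert ((A + C).T == A.T + C.T).all()        # (2)
assert ((alpha * A).T == alpha * A.T).all()  # (3)
assert ((A @ B).T == B.T @ A.T).all()        # (4) note the reversed order

S = rng.integers(-5, 5, size=(3, 3))         # (5) requires a square matrix
assert (np.linalg.matrix_power(S, 3).T == np.linalg.matrix_power(S.T, 3)).all()
```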

Definition

Let \(A = (a_{ij})_{n \times n}\)
\(A\) is called symmetric if \(A^T = A\), that is, \(a_{ji} = a_{ij}\) for all \(1 \leq i,j \leq n\).
\(A\) is called antisymmetric if \(A^T = -A\), that is, \(a_{ji} = -a_{ij}\) for all \(1 \leq i,j \leq n\).

Remark

If \(A = (a_{ij})_{n \times n}\) is an antisymmetric matrix, then the entries on the main diagonal are \(0\). In fact, taking \(i = j\) in \(a_{ji} = -a_{ij}\) gives \(a_{ii}=-a_{ii}\Rightarrow2a_{ii}=0\Rightarrow a_{ii}=0\).
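A quick numerical illustration of this remark, using the antisymmetric matrix that appears in a later example:

```python
import numpy as np

A = np.array([[0, -2, -3],
              [2, 0, 4],
              [3, -4, 0]])

assert (A.T == -A).all()        # A is antisymmetric
assert (np.diag(A) == 0).all()  # so its main diagonal is zero
```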

Theorem

Let \(A \in M_{n \times n}(\mathbb{R})\) be a symmetric matrix and \(\alpha \in \mathbb{R}\), then:
\((1)\ A^T\) is symmetric,
\((2)\ \alpha A\) is symmetric.

Proof
\((1)\ (A^T)^T = A = A^T\) and thus \(A^T\) is symmetric.
\((2)\ (\alpha A)^T = \alpha A^T = \alpha A\) and thus \(\alpha A\) is symmetric.

Theorem

\((1)\) Let \(A \in M_{n \times n}(\mathbb{R})\), then \(A + A^T\) is symmetric.
\((2)\) Let \(A \in M_{n \times n}(\mathbb{R})\), then \(AA^T\) and \(A^T A\) are symmetric.

Proof
\((1)\ (A + A^T)^T = A^T + (A^T)^T = A^T + A = A + A^T\), thus \(A + A^T\) is symmetric.
\((2)\ (AA^T)^T = (A^T)^T A^T = A A^T\), thus \(AA^T\) is symmetric.
\(~~~~~\ (A^T A)^T = A^T (A^T)^T = A^T A\), thus \(A^T A\) is symmetric.
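Both statements of the theorem can be spot-checked on an arbitrary square matrix (a NumPy sketch; the helper `is_symmetric` is introduced here for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.integers(-5, 5, size=(3, 3))

def is_symmetric(M):
    """True when M equals its transpose."""
    return (M.T == M).all()

assert is_symmetric(A + A.T)   # (1)
assert is_symmetric(A @ A.T)   # (2)
assert is_symmetric(A.T @ A)
```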

Example

Let \(A = \begin{pmatrix} 1 & 2 & 3 \\ 2 & 4 & 5 \\ 3 & 5 & 6 \end{pmatrix}\), then \(A^T = \begin{pmatrix} 1 & 2 & 3 \\ 2 & 4 & 5 \\ 3 & 5 & 6 \end{pmatrix} = A\), that is, \(A\) is symmetric.

Let \(A = \begin{pmatrix} 0 & -2 & -3 \\ 2 & 0 & 4 \\ 3 & -4 & 0 \end{pmatrix}\), then \(A^{T}=\begin{pmatrix}0 & 2 & 3\\ -2 & 0 & -4\\ -3 & 4 & 0\end{pmatrix}=-\begin{pmatrix}0&-2&-3\\2&0&4\\3&-4&0\end{pmatrix}=-A\), that is, \(A\) is antisymmetric.
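Both examples can be checked in one line each with NumPy:

```python
import numpy as np

S = np.array([[1, 2, 3],
              [2, 4, 5],
              [3, 5, 6]])
K = np.array([[0, -2, -3],
              [2, 0, 4],
              [3, -4, 0]])

assert (S.T == S).all()   # S is symmetric
assert (K.T == -K).all()  # K is antisymmetric
```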

Definition

A matrix \(A = (a_{ij})_{n \times n}\) is called invertible if there exists \(B = (b_{ij})_{n \times n}\) such that \(AB = I_n = BA\). The matrix \(B\) is called the inverse of \(A\) and is denoted \(A^{-1}\).

Theorem

If \(A = (a_{ij})_{n \times n}\) is invertible, then its inverse is unique.

Proof
Suppose \(B\) and \(C\) are inverses of \(A\), then \(AB = I_n = BA\) and \(AC = I_n = CA\)
Now, \(B = B I_n = B (AC) = (BA) C = I_n C = C\)

Remark

If \(A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}\) with \(ad - bc \neq 0\), then \(A\) is invertible and \(A^{-1} = \frac{1}{ad - bc} \begin{pmatrix} d & -b \\ -c & a \end{pmatrix}\)

In fact, let \(B = \frac{1}{ad - bc} \begin{pmatrix} d & -b \\ -c & a \end{pmatrix}\), then:
\(\begin{aligned}AB & =\begin{pmatrix}a & b\\ c & d\end{pmatrix}\left\lbrack\frac{1}{ad-bc}\begin{pmatrix}d & -b\\ -c & a\end{pmatrix}\right\rbrack=\frac{1}{ad-bc}\left\lbrack\begin{pmatrix}a & b\\ c & d\end{pmatrix}\begin{pmatrix}d & -b\\ -c & a\end{pmatrix}\right\rbrack=\frac{1}{ad-bc}\begin{pmatrix}(ad-bc) & 0\\ 0 & (ad-bc)\end{pmatrix}=\begin{pmatrix}1 & 0\\ 0 & 1\end{pmatrix}\\ BA & =\left\lbrack\frac{1}{ad-bc}\begin{pmatrix}d & -b\\ -c & a\end{pmatrix}\right\rbrack\begin{pmatrix}a & b\\ c & d\end{pmatrix}=\frac{1}{ad-bc}\left\lbrack\begin{pmatrix}d & -b\\ -c & a\end{pmatrix}\begin{pmatrix}a & b\\ c & d\end{pmatrix}\right\rbrack=\frac{1}{ad-bc}\begin{pmatrix}(ad-bc) & 0\\ 0 & (ad-bc)\end{pmatrix}=\begin{pmatrix}1 & 0\\ 0 & 1\end{pmatrix}\end{aligned}\)
Therefore, \(B = A^{-1}\).
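The \(2 \times 2\) inverse formula translates directly into code; a sketch (the function name `inverse_2x2` is an illustrative choice), compared against NumPy's general-purpose `np.linalg.inv`:

```python
import numpy as np

def inverse_2x2(A):
    """Inverse of a 2x2 matrix via the formula above; requires ad - bc != 0."""
    a, b = A[0]
    c, d = A[1]
    det = a * d - b * c
    if det == 0:
        raise ValueError("ad - bc = 0: the matrix is not invertible")
    return (1 / det) * np.array([[d, -b],
                                 [-c, a]])

A = np.array([[2.0, -1.0],
              [3.0, 0.0]])
B = inverse_2x2(A)

assert np.allclose(A @ B, np.eye(2))  # AB = I_2
assert np.allclose(B @ A, np.eye(2))  # BA = I_2
assert np.allclose(B, np.linalg.inv(A))
```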

Theorem

Let \(A, B \in M_{n \times n}(\mathbb{R})\) be invertible matrices and \(\alpha \in \mathbb{R}\), then:
\((1)\ A^{-1}\) is invertible and \((A^{-1})^{-1} = A\),
\((2)\) if \(\alpha \neq 0\), then \(\alpha A\) is invertible and \((\alpha A)^{-1} = \frac{1}{\alpha} A^{-1}\),
\((3)\ AB\) is invertible and \((AB)^{-1} = B^{-1} A^{-1}\),
\((4)\ A^T\) is invertible and \((A^T)^{-1} = (A^{-1})^T\).
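All four properties can be verified numerically on concrete invertible matrices (a NumPy sketch; floating-point comparison uses `np.allclose`):

```python
import numpy as np

A = np.array([[2.0, -1.0],
              [3.0, 0.0]])  # det = 3, invertible
B = np.array([[1.0, 1.0],
              [0.0, 2.0]])  # det = 2, invertible
alpha = 3.0
inv = np.linalg.inv

assert np.allclose(inv(inv(A)), A)                        # (1)
assert np.allclose(inv(alpha * A), (1 / alpha) * inv(A))  # (2), alpha != 0
assert np.allclose(inv(A @ B), inv(B) @ inv(A))           # (3) reversed order
assert np.allclose(inv(A.T), inv(A).T)                    # (4)
```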