# Thursday November 14th

## Equivalence to Canonical Forms

Let $D$ be a division ring and $k$ a field.

Recall that a matrix $A$ is *equivalent* to $B \iff \exists P, Q$ invertible such that $PBQ = A$. From a previous theorem, if $\rank(A) = r$, then $A$ is equivalent to a matrix with $I_r$ in the top-left block.

**Theorem:** Let $A$ be a matrix over a PID $R$. Then $A$ is equivalent to a matrix with $L_r$ in the top-left corner, where $L_r = \mathrm{diag}(d_1, d_2, \cdots, d_r)$ with $d_1 \divides d_2 \divides \cdots \divides d_r$, and the $d_i$ are uniquely determined up to units.

**Theorem:** Let $A$ be an $n\times n$ matrix over a division ring $D$. TFAE:

1. $\rank A = n$.
2. $A$ is equivalent to $I_n$.
3. $A$ is invertible.

*Proof:*

$1\implies 2$: Use Gaussian elimination.

$2\implies 3$: $A = P I_n Q = PQ$ where $P, Q$ are invertible, so $A = PQ$ is invertible.

$3\implies 1$: If $A$ is invertible, then $A: D^n \to D^n$ is bijective and in particular surjective, so $\dim \im A = n$.

> Note: the image is now the *row space*, because we are taking *left* actions.

$\qed$
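As an aside (not from the lecture): the invariant factors $d_i$ in the theorem above can also be computed without any row reduction, via *determinantal divisors*: over a PID, $d_k = D_k / D_{k-1}$, where $D_k$ is the gcd of all $k\times k$ minors and $D_0 = 1$. Below is a minimal Python sketch of this, specialized to integer matrices; the function names are illustrative.

```python
from itertools import combinations
from math import gcd
from functools import reduce

def det(M):
    """Determinant by Laplace expansion along the first row (fine for small matrices)."""
    n = len(M)
    if n == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(n))

def invariant_factors(A):
    """Invariant factors of an integer matrix via determinantal divisors:
    d_k = D_k / D_{k-1}, where D_k = gcd of all k x k minors of A."""
    m, n = len(A), len(A[0])
    factors, D_prev = [], 1
    for k in range(1, min(m, n) + 1):
        minors = [det([[A[i][j] for j in cols] for i in rows])
                  for rows in combinations(range(m), k)
                  for cols in combinations(range(n), k)]
        D_k = reduce(gcd, (abs(d) for d in minors))
        if D_k == 0:  # all k x k minors vanish, so rank(A) < k: stop
            break
        factors.append(D_k // D_prev)
        D_prev = D_k
    return factors

print(invariant_factors([[2, 4], [6, 8]]))  # [2, 4]
```

For $A = \begin{pmatrix} 2 & 4 \\ 6 & 8 \end{pmatrix}$: $D_1 = \gcd(2, 4, 6, 8) = 2$ and $D_2 = \abs{\det A} = 8$, so $d_1 = 2$ and $d_2 = 4$, consistent with $d_1 \divides d_2$.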
## Determinants

**Definition:** Let $M_1, \cdots, M_n$ be $R\dash$modules. A map $f: \prod M_i \to R$ is **$n\dash$linear** iff
\begin{align*}
f( m_1, \cdots, rm_k + sm_k', \cdots, m_n ) = r f( m_1, \cdots, m_k, \cdots, m_n ) + s f( m_1, \cdots, m_k', \cdots, m_n )
.\end{align*}

*Example:* The inner product is a 2-linear form.

**Definition:** $f$ is **symmetric** iff
$$ f(m_1, \cdots, m_n) = f(m_{\sigma(1)}, \cdots, m_{\sigma(n)}) ~~\forall \sigma \in S_n .$$

**Definition:** $f$ is **skew-symmetric** iff
$$ f(m_1, \cdots, m_n) = \mathrm{sgn}(\sigma) f(m_{\sigma(1)}, \cdots, m_{\sigma(n)}) ~~\forall \sigma \in S_n ,$$
where
$$ \mathrm{sgn}(\sigma) = \begin{cases} 1 & \sigma \text{ is even } \\ -1 & \sigma \text{ is odd } \end{cases} .$$

**Definition:** $f$ is **alternating** iff
$$ m_i = m_j \text{ for some pair } i \neq j \implies f(m_1, \cdots, m_n) = 0 .$$

**Theorem:** Let $f$ be an $n\dash$linear form. If $f$ is alternating, then $f$ is skew-symmetric.

*Proof:* Since every permutation is a product of transpositions, it suffices to show the $n = 2$ case. We have
\begin{align*}
0 &= f(m_1 + m_2, m_1 + m_2) \\
&= f(m_1, m_1) + f(m_1, m_2) + f(m_2, m_1) + f(m_2, m_2) \\
&= f(m_1, m_2) + f(m_2, m_1) \\
\implies f(m_1, m_2) &= - f(m_2, m_1)
.\end{align*}
$\qed$

**Theorem:** Let $R$ be a unital commutative ring and let $r\in R$ be arbitrary. Then
$$ \exists! f: \bigoplus_{i=1}^n R^n \to R ,$$
where $f$ is an alternating $n\dash$linear $R\dash$form such that $f(\vector e_1, \cdots, \vector e_n) = r$, where $\vector e_i = [0, 0, \cdots, 0, 1, 0, \cdots, 0, 0]$ has a $1$ in the $i$th position.

> $R^n$ is a free module, so $f$ can be identified with a matrix once a basis is chosen.

*Proof:*

*Existence:* Let $x_i = [a_{i1}, a_{i2}, \cdots, a_{in}]$ and define
$$ f(x_1, \cdots, x_n) = \sum_{\sigma \in S_n} \mathrm{sgn}(\sigma)\, r \prod_i a_{i \sigma(i)} .$$

*Exercise:* Check that $f(\vector e_1, \cdots, \vector e_n) = r$ and that $f$ is $n\dash$linear.

Moreover, $f$ is alternating. Consider $f(x_1, \cdots, x_n)$ where $x_i = x_j$ for some $i \neq j$; then $a_{ik} = a_{jk}$ for all $k$. Letting $\rho = (i, j)$, we can write $S_n = A_n \disjoint A_n \rho$. If $\sigma$ is even, then its summand is
$$ (+1)\, r\, a_{1\sigma(1)} \cdots a_{n\sigma(n)} .$$
Now consider the summand for the odd permutation $\sigma\rho$. Since $\sigma\rho(i) = \sigma(j)$ and $\sigma\rho(j) = \sigma(i)$, we have
\begin{align*}
-r \prod_k a_{k\, \sigma\rho(k)}
&= -r\, a_{1\sigma(1)} \cdots \mathbf{a_{i \sigma(j)}} \cdots \mathbf{a_{j \sigma(i)}} \cdots a_{n\sigma(n)} \\
&= -r\, a_{1\sigma(1)} \cdots \mathbf{a_{j \sigma(j)}} \cdots \mathbf{a_{i \sigma(i)}} \cdots a_{n\sigma(n)} \\
&= -r \prod_k a_{k \sigma(k)}
,\end{align*}
using $a_{ik} = a_{jk}$ to trade the $i, j$ rows in the two bolded factors, while the remaining factors are untouched. So the terms for $\sigma$ and $\sigma\rho$ cancel in pairs, and hence $f(x_1, \cdots, x_n) = 0$.

*Uniqueness:* Let $x_i = \sum_j a_{ij} \vector e_j$. Then
\begin{align*}
f(x_1, \cdots, x_n) &= f\left( \sum_{j_1} a_{1 j_1} \vector e_{j_1}, \cdots, \sum_{j_n} a_{n j_n} \vector e_{j_n} \right) \\
&= \sum_{j_1} \cdots \sum_{j_n} f(\vector e_{j_1}, \cdots, \vector e_{j_n} )\, a_{1 j_1} \cdots a_{n j_n} \\
&= \sum_{\sigma \in S_n} \mathrm{sgn}(\sigma) f(\vector e_1, \cdots, \vector e_n)\, a_{1 \sigma(1)} \cdots a_{n \sigma(n)} \\
&= \sum_{\sigma \in S_n} \mathrm{sgn}(\sigma)\, r\, a_{1 \sigma(1)} \cdots a_{n \sigma(n)}
,\end{align*}
where the third equality holds because $f$ is alternating: any term with a repeated index vanishes, so only injective index choices $j_k = \sigma(k)$ for $\sigma \in S_n$ survive, each contributing a sign when reordered. Thus $f$ is completely determined by $r$. $\qed$

**Definition:** Let $R$ be a commutative unital ring. Define $\mathrm{det}: M_n(R) \to R$ to be the unique alternating $n\dash$linear form (in the rows) with $\det(I) = 1$; it is called the *determinant*.

**Theorem:** Let $A, B \in M_{n}(R)$. Then

a. $\abs{AB} = \abs A \abs B$.
b. $A$ is invertible $\iff \abs{A} \in R\units$.
c. If $A$ is similar to $B$, then $\abs A = \abs B$.
d. $\abs{A^t} = \abs A$.
e. If $A$ is triangular, then $\abs A$ is the product of the diagonal entries.

*Proof of a:* Fix $B$ and define $\Delta_B: M_n(R) \to R$ by $C \mapsto \abs{CB}$. This is an alternating $n\dash$linear form in the rows of $C$, so by the theorem, $\Delta_B = r\, \mathrm{det}$ for some $r$, i.e. $\abs{CB} = r \abs{C}$. Picking $C = I$ yields $r = \abs{B}$. $\qed$

*Proof of b:* Suppose $A$ is invertible. Then $AA\inv = I$, so $\abs{A}\abs{A\inv} = \abs{AA\inv} = 1$, which shows that $\abs A$ is a unit. (The converse will follow from the cofactor/adjugate formula for $A\inv$ next time.) $\qed$

*Proof of c:* Let $A = PBP\inv$. Then
$$ \abs A = \abs{PBP\inv} = \abs P \abs B \abs{P\inv} = \abs{P} \abs{P\inv} \abs B = \abs B .$$
$\qed$

*Proof of d:* Let $A = (a_{ij})$, so $A^t = (b_{ij})$ where $b_{ij} = a_{ji}$. Then
\begin{align*}
\abs{A^t} &= \sum_{\sigma} \mathrm{sgn}(\sigma) \prod_k b_{k \sigma(k)} \\
&= \sum_\sigma \mathrm{sgn}(\sigma) \prod_k a_{\sigma(k)\, k} \\
&= \sum_\sigma \mathrm{sgn}(\sigma\inv) \prod_k a_{k\, \sigma\inv(k)} \\
&= \sum_\sigma \mathrm{sgn}(\sigma) \prod_k a_{k \sigma(k)} \\
&= \abs {A}
,\end{align*}
reindexing the product by $k \mapsto \sigma(k)$, using $\mathrm{sgn}(\sigma) = \mathrm{sgn}(\sigma\inv)$, and then summing over $\sigma\inv$ as $\sigma$ ranges over $S_n$. $\qed$

*Proof of e:* Let $A$ be upper-triangular, so $a_{k \sigma(k)} = 0$ whenever $\sigma(k) < k$. Every $\sigma \neq \mathrm{id}$ has some $k$ with $\sigma(k) < k$, so only the identity term survives:
$$ \abs A = \sum_\sigma \mathrm{sgn}(\sigma) \prod_k a_{k \sigma(k)} = a_{11} a_{22} \cdots a_{nn} .$$
$\qed$

A computational check of (a), (d), and (e) appears after the list below.

Next time:

- Calculate determinants
- Gaussian elimination
- Cofactors
- Formulas for $A\inv$
- Cramer's rule
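As a computational footnote (not from the lecture): the permutation sum from the existence proof can be implemented directly, giving a slow but transparent determinant. A minimal Python sketch, with all names illustrative, checking (a), (d), (e), and the alternating property on small integer matrices:

```python
from itertools import permutations

def sgn(sigma):
    """Sign of a permutation (as a tuple), computed from its inversion count."""
    n = len(sigma)
    inversions = sum(1 for i in range(n) for j in range(i + 1, n)
                     if sigma[i] > sigma[j])
    return -1 if inversions % 2 else 1

def leibniz_det(A):
    """det A = sum over sigma in S_n of sgn(sigma) * prod_k a_{k, sigma(k)}:
    the alternating n-linear form from the existence proof, with r = 1."""
    n = len(A)
    total = 0
    for sigma in permutations(range(n)):
        term = sgn(sigma)
        for k in range(n):
            term *= A[k][sigma[k]]
        total += term
    return total

A = [[1, 2, 3], [4, 5, 6], [7, 8, 10]]
B = [[2, 0, 1], [1, 1, 0], [0, 3, 1]]
AB = [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)] for i in range(3)]
At = [list(col) for col in zip(*A)]

assert leibniz_det(AB) == leibniz_det(A) * leibniz_det(B)    # (a): |AB| = |A||B|
assert leibniz_det(At) == leibniz_det(A)                     # (d): |A^t| = |A|
assert leibniz_det([[2, 5, 7], [0, 3, 1], [0, 0, 4]]) == 24  # (e): 2 * 3 * 4
assert leibniz_det([[1, 2], [1, 2]]) == 0                    # alternating: equal rows
```

This runs in $O(n \cdot n!)$ time and is only practical for tiny matrices; Gaussian elimination and cofactor expansion (next time) are how determinants are actually computed.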