# Tuesday November 19th

## Determinants

Let $A\in M_n(R)$, where $R$ is a commutative unital ring. Given $A = (a_{ij})$, recall that
$$ \det A = \sum_{\sigma \in S_n} \mathrm{sgn}(\sigma) \prod_{i=1}^n a_{i, \sigma(i)} .$$

This satisfies a number of properties:

- $\det(AB) = \det A \det B$
- $A$ invertible $\implies$ $\det A$ is a unit in $R$
- $A \sim B \implies \det(A) = \det(B)$
- $\det A^t = \det A$
- $A$ is triangular $\implies \det A = \prod a_{ii}$.

### Calculating Determinants

1. **Gaussian Elimination**
   a. $B$ is obtained from $A$ by interchanging two rows: $\det B = -\det A$
   b. $B$ is obtained from $A$ by multiplying a row by a scalar $r$: $\det B = r \det A$
   c. $B$ is obtained from $A$ by adding a scalar multiple of one row to another: $\det B = \det A$.
2. **Cofactors**: Let $A_{ij}$ be the $(n-1)\times (n-1)$ matrix obtained by deleting row $i$ and column $j$, and let $C_{ij} = (-1)^{i+j} \det A_{ij}$. Then **(theorem)** $\det A = \sum_{j=1}^n a_{ij} C_{ij}$, expanding along any fixed row (or, analogously, any column).

**Theorem**:
$$ A \mathrm{Adj}(A) = \det (A) I_n ,$$
where $\mathrm{Adj}(A) = (C_{ij})^t$. If $\det A$ is a unit, then $A\inv = \mathrm{Adj}(A) / \det(A)$.

### Decomposition of a Linear Transformation

Let $\phi: V \to V$ be a linear transformation of vector spaces, and let $R = \hom_k(V, V)$; then $R$ is a ring. Let $f(x) = \sum a_j x^j \in k[x]$ be an arbitrary polynomial. Then for $\phi \in R$ it makes sense to evaluate $f(\phi)$, where $\phi^n$ denotes an $n\dash$fold composition, and $f(\phi): V \to V$.

**Lemma:**

a. There exists a unique monic polynomial $q_\phi(x) \in k[x]$ such that $q_\phi(\phi) = 0$ and $f(\phi) = 0 \implies q_\phi \divides f$. $q_\phi$ is referred to as the **minimal polynomial** of $\phi$.
b. The exact same conclusion holds with $\phi$ replaced by a matrix $A$, yielding $q_A$.
c. If $A$ is the matrix of $\phi$ relative to a fixed basis, then $q_\phi = q_A$.
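Before proving the lemma, the cofactor and adjugate formulas above can be sanity-checked computationally. A minimal sketch in pure Python with exact integer arithmetic (the helper names `minor`, `det`, `adjugate` are our own), verifying $A \,\mathrm{Adj}(A) = \det(A) I_n$ on a small example:

```python
def minor(A, i, j):
    """Submatrix A_ij: delete row i and column j of A."""
    return [[A[r][c] for c in range(len(A)) if c != j]
            for r in range(len(A)) if r != i]

def det(A):
    """Determinant by cofactor expansion along the first row."""
    n = len(A)
    if n == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j] * det(minor(A, 0, j)) for j in range(n))

def adjugate(A):
    """Adj(A) = (C_ij)^t: entry (i, j) is the cofactor C_ji."""
    n = len(A)
    return [[(-1) ** (i + j) * det(minor(A, j, i)) for j in range(n)]
            for i in range(n)]

# Check A Adj(A) = det(A) I_n on a 3x3 integer matrix.
A = [[2, 0, 1], [1, 3, 0], [0, 1, 4]]
d = det(A)
adj = adjugate(A)
P = [[sum(A[i][k] * adj[k][j] for k in range(3)) for j in range(3)]
     for i in range(3)]
assert P == [[d if i == j else 0 for j in range(3)] for i in range(3)]
```

Over a general commutative ring one would replace the integer arithmetic accordingly; the expansion itself uses only ring operations.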
*Proof of a and b:* Fix $\phi$, and define
\begin{align*}
\Gamma: k[x] &\to \hom_k(V, V) \\
f &\mapsto f(\phi)
.\end{align*}

Since $\dim_k \hom_k(V, V) = (\dim_k V)^2 < \infty$ while $\dim_k k[x] = \infty$, $\Gamma$ cannot be injective, so $\ker \Gamma \neq 0$. Since $k[x]$ is a PID, we have $\ker \Gamma = (q)$ for some $q\in k[x]$. Then if $f(\phi) = 0$, we have $f(x) \in \ker \Gamma \implies q \divides f$. We can then rescale $q$ to be monic, which makes it unique.

> Note: for (b), just replace $\phi$ with $A$ everywhere.

$\qed$

*Proof of c:* Suppose $A = [\phi]_\mathcal{B}$ for some fixed basis $\mathcal B$. Then $\hom_k(V, V) \cong M_n(k)$, so we have the following commutative diagram:

```{=latex}
\begin{center}
\begin{tikzcd}
{k[x]} \arrow[rr, "\Gamma_\phi"] \arrow[rrdd, "\Gamma_A"] & & {\hom_k(V, V)} \arrow[dd, "\cong", two heads, hook] \\
& & \\
& & M_n(k)
\end{tikzcd}
\end{center}
```

$\qed$

### Finitely Generated Modules over a PID

Let $M$ be a finitely generated module over a PID $R$. Then
\begin{align*}
M &\cong F \oplus \bigoplus_{i=1}^n R/(r_i) \quad r_1 \divides r_2 \divides \cdots \divides r_n \\
M &\cong F \oplus \bigoplus_{i=1}^n R/(p_i^{s_i}) \quad p_i \text{ not necessarily distinct primes. }
.\end{align*}

Letting $R = k[x]$ and $\phi: V\to V$ with $\dim_k V < \infty$, $V$ becomes a $k[x]\dash$module by defining
$$ f(x) \actson \vector v \definedas f(\phi)(\vector v) .$$

Note that $W \leq V$ is a $k[x]\dash$submodule iff $W$ is $\phi\dash$invariant, i.e. $\phi(W) \subseteq W$.

For $v\in V$, the span of $\theset{\phi^i(v) \suchthat i = 0,1,2,\cdots}$ is the **cyclic submodule generated by $v$**, and we write $\generators{v} = k[x].v$.

**Theorem:** Let $\phi: V\to V$ be a linear transformation. Then

1. There exist cyclic $k[x]\dash$submodules $V_i$ such that $V = \bigoplus_{i=1}^t V_i$, where $q_i$ is the minimal polynomial of the restriction $\phi: V_i \to V_i$ and $q_1 \divides q_2 \divides \cdots \divides q_t$.
2. 
There exist cyclic $k[x]\dash$submodules $V_j$ such that $V = \bigoplus_{j=1}^\nu V_j$, where $p_j^{m_j}$ is the minimal polynomial of $\phi: V_j \to V_j$ with each $p_j$ irreducible (the $p_j$ not necessarily distinct).

*Proof:* Apply the classification theorem to write $V = \bigoplus R/(q_i)$ as an invariant factor decomposition. Each summand $R/(q_i) \cong V_i$ is a vector space, and since the decomposition is a direct sum of $k[x]\dash$modules, each invariant factor $q_i$ is the minimal polynomial of the restriction $\phi_i: V_i \to V_i$, so $V_i \cong k[x]/(q_i)$ as a $k[x]\dash$module.

$\qed$

### Canonical Forms for Matrices

We'll look at

- Rational Canonical Form
- Jordan Canonical Form

**Theorem**: Let $\phi: V\to V$ be linear. Then $V$ is a cyclic $k[x]\dash$module and $\phi: V\to V$ has minimal polynomial $q(x) = x^n + \sum_{j=0}^{n-1} a_j x^j$ iff $\dim V = n$ and $V$ has an ordered basis $\mathcal B$ such that
$$ [\phi]_{\mathcal{B}} = \left[ \begin{array}{ccccc} 0 & 1 & 0 & \cdots & 0 \\ 0 & 0 & 1 & \cdots & 0 \\ \vdots & \vdots & \vdots & \cdots & \vdots \\ -a_0 & -a_1 & -a_2 & \cdots & -a_{n-1} \end{array} \right] ,$$
with ones on the super-diagonal (the **companion matrix** of $q$).

*Proof:* $\implies$: Let $V = k[x].v = \generators{v, \phi(v), \cdots, \phi^{n-1}(v)}$ where $\deg q(x) = n$. The claim is that this is a linearly independent spanning set.

Linear independence: suppose $\sum_{j=0}^{n-1} k_j \phi^j(v) = 0$ with some $k_j \neq 0$. Then $f(x) = \sum k_j x^j$ is a nonzero polynomial of degree less than $n$ with $f(\phi) = 0$, which contradicts the minimality of $q(x)$. But then we have $n$ linearly independent vectors in $V$, which has dimension $n$, so this is a spanning set.

To compute $[\phi]_{\mathcal B}$, we can just check where basis elements are sent. Set $\mathcal{B} = \theset{v, \phi(v), \cdots, \phi^{n-1}(v)}$.
Then
\begin{align*}
v &\mapsto \phi(v) \\
\phi(v) &\mapsto \phi^2(v) \\
&\vdots \\
\phi^{n-1}(v) &\mapsto \phi^n(v) = -\sum_{i=0}^{n-1} a_i \phi^i(v)
,\end{align*}
which yields exactly the matrix above.

$\impliedby$: Fix a basis $B = \theset{v_1, \cdots, v_n}$ with $A = [\phi]_B$ of the above form. Then
\begin{align*}
v_1 &\mapsto v_2 = \phi(v_1) \\
v_2 &\mapsto v_3 = \phi^2(v_1) \\
&\vdots \\
v_{n-1} &\mapsto v_n = \phi^{n-1}(v_1)
,\end{align*}
and
$$ \phi^n(v_1) = -a_0 v_1 - a_1 \phi(v_1) - \cdots - a_{n-1} \phi^{n-1}(v_1) .$$

Thus $V = k[x].v_1$, since $\dim V = n$ and $\theset{v_1, \phi(v_1), \cdots, \phi^{n-1}(v_1)}$ is a basis. $\qed$
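The proof above can be checked on a concrete example: build the companion matrix of a monic polynomial, verify that $v_1 = e_1$ is a cyclic vector, and verify that $q(\phi)$ kills it. A small sketch in pure Python, using row vectors acting on the right (matching the matrix convention above); the polynomial $q(x) = x^3 - 2x^2 + x - 3$ is an arbitrary illustrative choice:

```python
# Companion matrix of q(x) = x^3 - 2x^2 + x - 3, as in the theorem:
# ones on the super-diagonal, last row (-a_0, -a_1, -a_2).
a = [-3, 1, -2]          # coefficients a_0, a_1, a_2 (q is monic, a_3 = 1)
n = len(a)
A = [[1 if j == i + 1 else 0 for j in range(n)] for i in range(n - 1)]
A.append([-c for c in a])

def phi(v):
    """Apply phi to a row vector: v -> v A."""
    return [sum(v[k] * A[k][j] for k in range(n)) for j in range(n)]

# e_1 is a cyclic vector: its iterates e_1, phi(e_1), phi^2(e_1)
# recover the standard basis, so V = k[x].e_1.
v = [1, 0, 0]
orbit = [v]
for _ in range(n - 1):
    orbit.append(phi(orbit[-1]))
assert orbit == [[1, 0, 0], [0, 1, 0], [0, 0, 1]]

# q(phi) kills v: phi^n(v) + a_2 phi^2(v) + a_1 phi(v) + a_0 v = 0.
w = phi(orbit[-1])                  # phi^n(v)
for c, u in zip(a, orbit):          # add each a_j phi^j(v)
    w = [wi + c * ui for wi, ui in zip(w, u)]
assert w == [0, 0, 0]
```

Since $q$ here is irreducible-free of repeated structure only by accident, the same check works for any monic $q$: the final assertion is exactly the relation $\phi^n(v_1) = -\sum a_i \phi^i(v_1)$ from the proof.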