# Friday August 23

# Chapter 3: Theorems of Lie and Cartan

## 4.1: Lie's Theorem

**Theorem:** Let $L$ be a solvable subalgebra of $\liegl(V)$, where $V$ is finite-dimensional. If $V\neq 0$, then $V$ contains a common eigenvector for all of the endomorphisms in $L$.

*Proof:* Use induction on $\dim L$. The case $\dim L = 0$ is trivial. We'll attempt to mimic the proof of Theorem 3.3. The idea is to

1. Locate an ideal $K$ of $L$ of codimension 1,
2. Show by induction that common eigenvectors exist for $K$,
3. Verify that $L$ stabilizes a space consisting of such eigenvectors,
4. Find in that space an eigenvector for a single $z \in L$ satisfying $L = K + Fz$.

**Step (1):** Since $L$ is solvable and of positive dimension, $[L, L] \lneq L$: if instead $L = [L, L]$, then $L^{(1)} = L \implies L^{(n)} = L$ for all $n$, which would contradict $L$ being solvable. Since $L/[L, L]$ is abelian, any subspace of it is automatically an ideal. So take a subspace of codimension one; its inverse image $K \normal L$ is then an ideal of codimension one in $L$ satisfying $[L, L] \subseteq K$.

**Step (2):** Use induction to find a common eigenvector $v\in V$ for $K$. ($K$ is solvable; if $K=0$, then $L$ is abelian of dimension 1, and any eigenvector for a basis vector of $L$ finishes the proof.) This means that $x\in K \implies x\actson v = \lambda(x) v$ for some linear functional $\lambda: K \to F$. Fix this $\lambda$, and let
$$
W = \theset{w\in V \suchthat x\actson w = \lambda(x) w \,\,\forall x\in K};
$$
note that $W \neq 0$.

**Step (3):** This will involve showing that $L$ leaves $W$ invariant. Assume for the moment that this is done, and proceed to step (4).

**Step (4):** Write $L = K + Fz$. Since $F$ is algebraically closed, we can find an eigenvector $v_0 \in W$ of $z$ for some eigenvalue of $z$. Then $v_0$ is a common eigenvector for $L$, and $\lambda$ can be extended to a linear functional on $L$ satisfying $x\actson v_0 = \lambda(x) v_0$ for all $x\in L$.

It remains to show that $L$ stabilizes $W$. Let $w\in W$ and $x\in L$. To test whether or not $x\actson w \in W$, we take an arbitrary $y\in K$ and examine
$$
yx \actson w = xy \actson w - [x, y]\actson w = \lambda(y)\, x\actson w - \lambda([x, y])\, w,
$$
where the last equality uses $y\actson w = \lambda(y) w$ and the fact that $[x, y] \in K$ because $K$ is an ideal.

> Note: the above equality is an important trick.

Thus we need to show that $\lambda([x, y]) = 0$. To this end, fix $w\in W$ and $x\in L$. Let $n > 0$ be the smallest integer for which $w, x\actson w, \cdots, x^n \actson w$ are linearly *dependent*. Let $W_i = \mathrm{span}\theset{w, x\actson w, \cdots, x^{i-1} \actson w}$ and set $W_0 = 0$. Then $\dim W_n = n$, and $W_{n+i} = W_n$ for all $i\geq 0$. Moreover, $x$ maps $W_n$ into itself.

It is easy to check that, relative to the basis $w, x\actson w, \cdots, x^{n-1}\actson w$ of $W_n$, each $y\in K$ is represented by an upper-triangular matrix with diagonal entries equal to $\lambda(y)$. This follows immediately from the congruence
$$
yx^i \actson w \equiv \lambda(y)\, x^i \actson w \mod W_i,
$$
which can be proved by induction on $i$. The case $i=0$ is trivial. For the inductive step, write
$$
yx^i \actson w = yxx^{i-1}\actson w = xyx^{i-1}\actson w - [x, y]x^{i-1} \actson w.
$$
By induction,
$$
yx^{i-1}\actson w = \lambda(y)\, x^{i-1}\actson w + w',
$$
where $w' \in W_{i-1}$. Since $x$ maps $W_{i-1}$ into $W_i$ by construction, the first term $xyx^{i-1}\actson w$ is congruent to $\lambda(y)\, x^i\actson w \mod W_i$; and since $[x, y] \in K$, the induction hypothesis shows that the second term $[x, y]x^{i-1}\actson w$ lies in $W_i$. So the congruence holds for all $i$.

According to our description of the action of $y\in K$ on $W_n$, we have $\Tr_{W_n}(y) = n\lambda(y)$. In particular, this is true for elements of $K$ of the special form $[x, y]$, where $x$ is as it was above and $y$ is in $K$. **But both $x$ and $y$ stabilize $W_n$, so $[x, y]$ acts on $W_n$ as the commutator of two endomorphisms of $W_n$, and the trace is therefore zero.** We conclude that $n\lambda([x, y]) = 0$. Since $\mathrm{char} F = 0$, this forces $\lambda([x, y]) = 0$, as required. $\qed$

**Corollary A (Lie's Theorem):** Let $L \leq \liegl(V)$ be a solvable subalgebra, where $\dim V = n < \infty$. Then $L$ stabilizes some flag in $V$, i.e. the matrices of $L$ relative to a suitable basis of $V$ are upper triangular.

*Proof:* Use the above theorem, along with induction on $\dim V$. This is similar to the proof of Corollary 3.3.
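For concreteness, here is a small example (not from the lecture, just a sanity check) illustrating both the theorem and the corollary, using the upper-triangular $2\times 2$ matrices:
\begin{align*}
L = \theset{\left(\begin{array}{ll}{a} & {b} \\ {0} & {c}\end{array}\right) \suchthat a, b, c \in F} \leq \liegl(F^2),
\qquad
[L, L] = \theset{\left(\begin{array}{ll}{0} & {b} \\ {0} & {0}\end{array}\right) \suchthat b \in F},
\qquad
[[L, L], [L, L]] = 0
,\end{align*}
so $L$ is solvable. Every element of $L$ sends $e_1 = (1, 0)^t$ to $a\, e_1$, so $e_1$ is a common eigenvector for $L$, with $\lambda$ reading off the $(1,1)$ entry. The corresponding flag $0 \subset Fe_1 \subset F^2$ is stabilized by $L$, as the corollary predicts.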
## 4.2: Jordan-Chevalley Decomposition

**Fact 1:** The Jordan canonical form of a single endomorphism $x$ over an algebraically closed field $F$ is an expression of $x$ in matrix form as a sum of blocks:

![Image](figures/2019-09-29-20:34.png)\

**Fact 2:** Call $x\in \mathrm{End} V$ *semisimple* if the roots of its minimal polynomial over $F$ are all distinct. Equivalently, if $F$ is algebraically closed, then $x$ is semisimple iff $x$ is diagonalizable.

**Fact 3:** Two commuting semisimple endomorphisms can be simultaneously diagonalized. Therefore, their sum or difference is again semisimple.

**Proposition:** Let $V$ be a finite-dimensional vector space over $F$ and $x\in\mathrm{End} V$. Then

a. There exist unique $x_s, x_n \in \mathrm{End} V$ satisfying the conditions: $x = x_s + x_n$, $x_s$ is semisimple, $x_n$ is nilpotent, and $x_s, x_n$ commute.
b. There exist polynomials $p(t), q(t)$ such that $x_s = p(x)$ and $x_n = q(x)$. In particular, $x_s, x_n$ commute with any endomorphism commuting with $x$.
c. If $A < B < V$ are subspaces and $x$ maps $B$ into $A$, then $x_s, x_n$ also map $B$ into $A$.

The decomposition $x = x_s + x_n$ is called the (additive) **Jordan-Chevalley decomposition** of $x$, or just the Jordan decomposition; $x_s$ and $x_n$ are respectively called the **semisimple part** and the **nilpotent part** of $x$.

*Example:*
\begin{align*}
x = \left(\begin{array}{ll}{1} & {1} \\ {0} & {1}\end{array}\right) \implies x_s = \left(\begin{array}{ll}{1} & {0} \\ {0} & {1}\end{array}\right),\quad x_n = \left(\begin{array}{ll}{0} & {1} \\ {0} & {0}\end{array}\right)
.\end{align*}
Note that $x_s x_n = x_n = x_n x_s$, $x_s = 2x - x^2$, and $x_n = x^2 - x$. We thus have $p(t) = 2t-t^2$ and $q(t) = t^2 - t$.
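To see the decomposition concretely, here is a minimal computational sketch (not from the lecture; it assumes SymPy is available and uses its `Matrix.jordan_form`, which returns $(P, J)$ with $x = PJP^{-1}$). It recovers $x_s$ and $x_n$ for the example above from the Jordan form and checks the defining properties and the polynomials $p$, $q$.

```python
from sympy import Matrix, zeros

# The example endomorphism from the notes.
x = Matrix([[1, 1],
            [0, 1]])

# Jordan form: x = P * J * P**(-1).
P, J = x.jordan_form()

# The semisimple part is the conjugate of the diagonal part of J;
# the nilpotent part is what remains.
D = Matrix.diag(*[J[i, i] for i in range(J.rows)])
x_s = P * D * P.inv()
x_n = x - x_s

# Defining properties of the Jordan-Chevalley decomposition.
assert x_s + x_n == x
assert x_s * x_n == x_n * x_s        # the two parts commute
assert x_n ** J.rows == zeros(2, 2)  # x_n is nilpotent
assert x_s.is_diagonalizable()       # x_s is semisimple

# Agreement with p(t) = 2t - t^2 and q(t) = t^2 - t from the example.
assert x_s == 2 * x - x ** 2
assert x_n == x ** 2 - x

print(x_s, x_n)
```

For this particular $x$ the Jordan form is a single block, so $x_s$ is the identity and $x_n$ is the strictly upper-triangular part, matching the example.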