# Monday, November 07

:::{.remark}
Last time: constructing a semisimple Lie algebra that has a given root system.
Setup:

- $\Delta = \ts{\alpha_1,\cdots, \alpha_\ell}$.
- $\hat L$ the free Lie algebra on $\ts{\hat x_i, \hat h_i, \hat y_i}_{1\leq i\leq \ell}$.
- $\hat K$ the ideal generated by the Serre relations.
- $L_0 \da \hat L/\hat K$ with quotient map $\pi$.
- $\hat \phi: \hat L\to \liegl(V)$ a representation we constructed.
- $\hat H$ the subspace of $\hat L$ spanned by $\ts{\hat h_i}$.
:::

:::{.theorem title="?"}
$\hat K \subseteq \ker \hat\phi$, so $\hat\phi$ induces a representation $\phi: L_0 \to \liegl(V)$:

\begin{tikzcd}
	{\hat L} && {\liegl(V)} \\
	\\
	& {L_0}
	\arrow["\hat\phi", from=1-1, to=1-3]
	\arrow["\pi"', from=1-1, to=3-2]
	\arrow["{\exists \phi}"', dashed, from=3-2, to=1-3]
\end{tikzcd}

> [Link to Diagram](https://q.uiver.app/?q=WzAsMyxbMCwwLCJcXGhhdCBMIl0sWzIsMCwiXFxsaWVnbChWKSJdLFsxLDIsIkxfMCJdLFswLDEsIlxcaGF0XFxwaGkiXSxbMCwyLCJcXHBpIiwyXSxbMiwxLCJcXGV4aXN0cyBcXHBoaSIsMix7InN0eWxlIjp7ImJvZHkiOnsibmFtZSI6ImRhc2hlZCJ9fX1dXQ==)
:::

:::{.proof title="?"}
Straightforward but tedious checking of all relations, e.g.
\[
\hat\phi(\hat h_i) \circ \hat\phi(\hat x_j) - \hat \phi(\hat x_j ) \circ \hat\phi(\hat h_i) = \inp{ \alpha_j}{\alpha_i} \hat \phi(\hat x_j)
.\]
:::

:::{.theorem title="?"}
In $L_0$, the $h_i$ form a basis for an $\ell\dash$dimensional abelian subalgebra $H$ of $L_0$, and moreover $L_0 = Y \oplus H \oplus X$, where $X, Y$ are the subalgebras generated by the $x_i$ and $y_i$ respectively.
:::

:::{.proof title="?"}
**Steps 1 and 2**:

:::{.claim}
$\pi(\hat H) = H$ is $\ell\dash$dimensional.
:::

Clearly the $\hat h_i$ span an $\ell\dash$dimensional subspace of $\hat L$, so we need to show that $\pi$ restricts to an isomorphism $\pi: \hat H\iso H$.
Suppose $\hat h \da \sum_{j=1}^\ell c_j \hat h_j \in \ker \pi$; since $\ker \pi = \hat K \subseteq \ker \hat\phi$, this forces $\hat\phi (\hat h) = 0$.
Thus
\[
0 = \hat h \cdot v_i = \sum_{j} c_j \hat{h}_j \cdot v_i = - \sum_j c_j \inp{ \alpha_i}{\alpha_j} v_i = - \sum_j a_{ij} c_j v_i \qquad \forall i
,\]
so $A\vector c = 0$ where $A$ is the Cartan matrix, and so $\vector c = \vector 0$ since $A$ is invertible (being essentially a Gram matrix).
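As a quick sanity check on the invertibility of $A$, which drives both step 2 above and step 3 below, one can compute determinants numerically.
The following is a minimal sketch (an added illustration, not from lecture), assuming type $A_\ell$ for concreteness and using Python with `numpy`; it relies on the known fact that $\det A = \ell + 1$ in type $A_\ell$:

```python
import numpy as np

def cartan_matrix_A(ell: int) -> np.ndarray:
    """Cartan matrix (a_ij) = (<alpha_i, alpha_j>) of type A_ell:
    2 on the diagonal, -1 for adjacent simple roots, 0 otherwise."""
    A = 2 * np.eye(ell, dtype=int)
    for i in range(ell - 1):
        A[i, i + 1] = A[i + 1, i] = -1
    return A

for ell in range(1, 6):
    A = cartan_matrix_A(ell)
    # det A = ell + 1 != 0, so A c = 0 has only the solution c = 0.
    assert round(np.linalg.det(A)) == ell + 1
print("Cartan matrices of types A_1,...,A_5 are invertible")
```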
**Step 3**:
The subspace $\sum_i \FF \hat x_i + \sum_i \FF \hat h_i + \sum_i \FF \hat y_i$ of $\hat L$ maps isomorphically into $L_0$ under $\pi$.
Indeed, S2 and S3 show that for each $i$, $\FF x_i + \FF h_i + \FF y_i$ is a homomorphic image of $\liesl_2(\FF)$, which is simple if $\characteristic \FF \neq 2$.
Note $\pi( \hat h_i) = h_i \neq 0$ in $L_0$ by (1), so this subspace of $L_0$ is isomorphic to $\liesl_2(\FF)$.
In particular, $\ts{x_i, h_i, y_i}$ is linearly independent in $L_0$ for each fixed $i$.
Supposing $0 = \sum_{j=1}^\ell (a_j x_j + b_j h_j + c_j y_j)$, apply $\ad_{L_0, h_i}$ for each $i$ to obtain
\[
0 = \sum_{j=1}^\ell \qty{ a_j\inp{\alpha_j}{\alpha_i} x_j + b_j \cdot 0 - c_j \inner{ \alpha_j }{ \alpha_i } y_j } = \sum_{j=1}^\ell \inner{ \alpha_j }{ \alpha_i } (a_j x_j - c_j y_j)
,\]
and by invertibility of $A$ we have $a_j x_j - c_j y_j = 0$ for each $j$.
So $a_j = c_j = 0$ for all $j$, and then $\sum_j b_j h_j = 0$ implies $b_j = 0$ for all $j$ by (1).

**Step 4**:
$H = \sum_{j=1}^\ell \FF h_j$ is an $\ell\dash$dimensional abelian subalgebra of $L_0$ by (1) and S1.

**Step 5**:
Write $[x_{i_1}\cdots x_{i_t}] \da [x_{i_1} [ x_{i_2} [ \cdots [x_{i_{t-1}} x_{i_t}] \cdots ]]] \in X$ for an iterated bracket, taken by convention to be bracketed from the right.
We have
\[
\ad_{L_0, h_j}([x_{i_1} \cdots x_{i_t}] ) = \qty{ \inner{ \alpha_{i_1} }{ \alpha_j } + \cdots + \inner{ \alpha_{i_t} }{ \alpha_j } } [x_{i_1} \cdots x_{i_t}] \qquad t\geq 1
,\]
and similarly for $[y_{i_1}\cdots y_{i_t}]$.

**Step 6**:
For $t\geq 2$, $[y_j [ x_{i_1} \cdots x_{i_t} ] ] \in X$, and similarly with the roles of the $x_i$ and $y_i$ reversed.
This follows from the fact that $\ad_{L_0, y_j}$ acts by derivations, using S2 and S3.

**Step 7**:
It follows from steps 4 through 6 that $Y+H+X$ is a subalgebra of $L_0$: one shows that $[[ x_{i_1} \cdots x_{i_s}], [y_{j_1} \cdots y_{j_t} ]] \in Y + H + X$, which comes down to the Jacobi identity and induction on $s+t$.
E.g.
\[
[ [x_1 x_2], [y_3 y_4] ] = [x_1[ x_2 [y_3 y_4 ] ] ] - [x_2 [x_1 [y_3 y_4]]] \in [x_1, \FF y_3 + \FF y_4] + \cdots \subseteq H + \cdots
,\]
which lands in $H$ since there are as many $x_i$ as $y_i$; if there were more $x_i$ than $y_i$, it would land in $X$, and so on.
Since $Y+H+X$ is a subalgebra that contains the generators $x_i, h_i, y_i$ of $L_0$, it must equal $L_0$.

**Step 8**:
The decomposition $L_0 = X + H + Y$ is a direct sum decomposition of $L_0$ into submodules for the adjoint action of $H$.
Use the computation in the previous step to see that every element of $X$ is a linear combination of elements $[x_{i_1}\cdots x_{i_t}]$, and similarly for $Y$.
These are eigenvectors for the action $\ad_H \actson L_0$ by (5); the corresponding eigenvalue functionals $\lambda$ are referred to as **weights**, and the weights occurring in $X$ have the form $\lambda = \sum_{i=1}^\ell c_i \alpha_i$ with $c_i \in \ZZ_{\geq 0}$, where $c_i$ is the number of times $i$ appears as an index in $i_1,\cdots, i_t$.
So every weight space $X_\lambda$ is finite-dimensional, and the weights occurring in $Y$ are the negatives $-\lambda$ of those occurring in $X$, while $H$ has weight zero.
Since the weights occurring in $X, H, Y$ are all different, their pairwise intersections are trivial and the sum is direct.
:::

:::{.remark}
$L_0 = Y \oplus H \oplus X$ is known as the **triangular decomposition**: by analogy with matrix algebras, the $x_i$ sit on the superdiagonal and bracket to upper-triangular elements, and the $y_i$ are their transposes.
:::
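:::{.remark}
As a concrete instance of the triangular decomposition, here is a short numerical sketch (an added illustration, not from lecture), taking $\liesl_3$ with the standard Chevalley generators $x_i = E_{i,i+1}$, $y_i = x_i^T$, $h_i = [x_i, y_i]$; it checks that $[x_1 x_2]$ is strictly upper triangular and that the eigenvalue formula of step 5 holds:

```python
import numpy as np

def E(i: int, j: int, n: int = 3) -> np.ndarray:
    """Matrix unit E_ij in gl_n (1-indexed)."""
    M = np.zeros((n, n))
    M[i - 1, j - 1] = 1.0
    return M

def brk(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Lie bracket [a, b] = ab - ba."""
    return a @ b - b @ a

# Chevalley generators of sl_3: x_i on the superdiagonal,
# y_i their transposes, h_i = [x_i, y_i] diagonal.
x = [E(1, 2), E(2, 3)]
y = [E(2, 1), E(3, 2)]
h = [brk(x[i], y[i]) for i in range(2)]

A = np.array([[2, -1], [-1, 2]])  # Cartan matrix of type A_2

# [x_1 x_2] = E_13 is strictly upper triangular.
x12 = brk(x[0], x[1])
assert np.allclose(x12, E(1, 3))

# Step 5: ad h_j scales [x_1 x_2] by <alpha_1, alpha_j> + <alpha_2, alpha_j>.
for j in range(2):
    assert np.allclose(brk(h[j], x12), (A[0, j] + A[1, j]) * x12)
print("sl_3 triangular decomposition checks passed")
```
:::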