# The spectral theorem and the singular value decomposition


Given a linear operator $L$ on a finite-dimensional vector space $V$, we often want to find a basis of $V$ that gives the simplest possible matrix representation of $L$. The ideal case is when there is a basis for $V$ consisting of eigenvectors of $L$, in which case $L$ can be represented by a diagonal matrix and is called “diagonalizable”. If $v_1,\ldots,v_n$ is a basis for $V$ where each $v_i$ is an eigenvector of $L$ with corresponding eigenvalue $\lambda_i$ (not necessarily distinct), then

$L\begin{bmatrix} v_1 & \cdots & v_n \end{bmatrix} = \begin{bmatrix} Lv_1 & \cdots & Lv_n \end{bmatrix} = \begin{bmatrix} v_1 & \cdots & v_n \end{bmatrix} \diag(\lambda_1,\ldots,\lambda_n).$

In other words, we have $L=V\Lambda V^{-1}$, where $V=\begin{bmatrix}v_1 & \cdots & v_n\end{bmatrix}$ and $\Lambda=\diag(\lambda_1,\ldots,\lambda_n)$. This is called the eigendecomposition of $L$.
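The factorization $L=V\Lambda V^{-1}$ can be checked numerically; a minimal sketch with numpy, using a small example matrix with distinct eigenvalues (distinct eigenvalues guarantee a basis of eigenvectors):

```python
import numpy as np

# Example matrix with eigenvalues 5 and 2 (distinct, hence diagonalizable).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix V whose
# columns are the corresponding eigenvectors.
eigvals, V = np.linalg.eig(A)
Lambda = np.diag(eigvals)

# Reconstruct A as V @ Lambda @ V^{-1}.
A_reconstructed = V @ Lambda @ np.linalg.inv(V)
print(np.allclose(A, A_reconstructed))  # True
```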

A linear operator $L$ is diagonalizable if and only if both the following conditions hold:

- The characteristic polynomial $p(x)$ of $L$ splits (i.e., factors into a product of linear polynomials).
- The algebraic multiplicity and geometric multiplicity of each eigenvalue $\lambda$ of $L$ are the same.
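When the second condition fails, diagonalization is impossible even though the characteristic polynomial splits. A sketch of the classic counterexample, a Jordan block, using numpy:

```python
import numpy as np

# Jordan block: the characteristic polynomial (x - 3)^2 splits, so the
# algebraic multiplicity of the eigenvalue 3 is 2, but the eigenspace
# is only one-dimensional (geometric multiplicity 1).
J = np.array([[3.0, 1.0],
              [0.0, 3.0]])

eigvals, V = np.linalg.eig(J)
print(eigvals)  # both (numerically) equal to 3

# The columns of V are (numerically) parallel, so they cannot form a
# basis of eigenvectors: J is not diagonalizable.
print(np.linalg.matrix_rank(V))  # 1
```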

The geometric multiplicity is always at most the algebraic multiplicity; equality occurs for all eigenvalues if and only if the geometric multiplicities sum to the degree of $p(x)$. More succinctly, $L$ is diagonalizable if and only if $V$ is the direct sum of the eigenspaces of $L$, in which case

$L = \lambda_1 I_{E_{\lambda_1}} + \cdots + \lambda_k I_{E_{\lambda_k}},$

where $I_{E_{\lambda_i}}$ is the identity on $E_{\lambda_i}$ (and zero outside of it).
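This sum of scaled eigenspace projections can also be verified numerically. A sketch with numpy, restricted for simplicity to a symmetric matrix (so `eigh` returns orthonormal eigenvectors and each projection onto $E_{\lambda_i}$ is just the outer product $v_i v_i^T$):

```python
import numpy as np

# Symmetric matrix: eigh returns real eigenvalues (ascending)
# and orthonormal eigenvectors.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigvals, V = np.linalg.eigh(A)

# With orthonormal eigenvectors, lambda_i * (v_i v_i^T) is lambda_i times
# the projection onto the eigenspace of lambda_i; summing these recovers A.
A_sum = sum(lam * np.outer(v, v) for lam, v in zip(eigvals, V.T))
print(np.allclose(A, A_sum))  # True
```

For a general (non-symmetric) diagonalizable matrix the projections are oblique, given column-wise by $V E_i V^{-1}$ rather than by outer products.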