Eigenvalues, Eigenvectors, and Invariant Subspaces (Part 2 & 3)
Based on Sheldon Axler's *Linear Algebra Done Right*
This post covers the definition of the matrix of an operator, the properties of upper triangular matrices, and the concept of eigenspaces.
1. The Matrix of an Operator
Previously, defining the matrix of a linear map from one vector space to another required a basis for each space. For an operator (a linear map $T: V \to V$), however, a single basis of $V$ suffices.
Definition
Let $v_1, \dots, v_n$ be a basis of $V$. The matrix of the operator $T$ with respect to this basis is the $n \times n$ matrix determined by:
$$ T(v_k) = A_{1,k}v_1 + \dots + A_{n,k}v_n $$
The coefficients of this linear combination form the $k$-th column of the matrix, which Axler writes as $\mathcal{M}(T)$.
- It is computed using a single basis of $V$.
- It is always a square matrix ($n \times n$).
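To make the definition concrete, here is a minimal numpy sketch (the helper name `matrix_of` is mine, not from the book): it builds $\mathcal{M}(T)$ column by column by solving for the coefficients of $T(v_k)$ in the given basis.

```python
import numpy as np

def matrix_of(T, basis):
    """Matrix of the operator T with respect to a single basis of R^n.

    T     : function taking and returning length-n numpy arrays
    basis : list of n linearly independent vectors v_1, ..., v_n
    Column k holds A_{1,k}, ..., A_{n,k}, where
    T(v_k) = A_{1,k} v_1 + ... + A_{n,k} v_n.
    """
    B = np.column_stack(basis)   # basis vectors as columns
    return np.column_stack([np.linalg.solve(B, T(v)) for v in basis])
```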
2. Upper Triangular Matrix
A matrix is called upper triangular if all entries below the diagonal are zero (i.e., $A_{j,k} = 0$ if $j > k$).
Example
Define $T \in \mathcal{L}(\mathbf{R}^3)$ by $T(x, y, z) = (2x + y, 5y + 3z, 8z)$.
Using the standard basis $(1,0,0), (0,1,0), (0,0,1)$:
- $T(1,0,0) = (2,0,0) \implies$ Col 1: $[2, 0, 0]^T$
- $T(0,1,0) = (1,5,0) \implies$ Col 2: $[1, 5, 0]^T$
- $T(0,0,1) = (0,3,8) \implies$ Col 3: $[0, 3, 8]^T$
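A quick numerical check of this example, reusing the `matrix_of` sketch from Section 1:

```python
T = lambda v: np.array([2*v[0] + v[1], 5*v[1] + 3*v[2], 8*v[2]])
std_basis = [np.eye(3)[:, j] for j in range(3)]

M = matrix_of(T, std_basis)
print(M)
# [[2. 1. 0.]
#  [0. 5. 3.]
#  [0. 0. 8.]]
assert np.allclose(M, np.triu(M))   # all entries below the diagonal are zero
```

The assembled matrix matches the columns computed above and is upper triangular.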
Theorem: Conditions for Upper Triangular Matrix
Let $T \in \mathcal{L}(V)$ and let $v_1, \dots, v_n$ be a basis for $V$. The following are equivalent:
- The matrix of $T$ with respect to this basis is upper triangular.
- $T(v_j) \in \operatorname{span}(v_1, \dots, v_j)$ for each $j = 1, \dots, n$.
- $\operatorname{span}(v_1, \dots, v_j)$ is invariant under $T$ for each $j = 1, \dots, n$.
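The second condition lends itself to a direct numerical test: $T(v_j)$ lies in $\operatorname{span}(v_1, \dots, v_j)$ exactly when appending it to those vectors does not increase the rank. A sketch (the function name `is_triangularizing_basis` is my own), continuing with `T` and `std_basis` from the example above:

```python
def is_triangularizing_basis(T, basis):
    """Check condition 2: T(v_j) in span(v_1, ..., v_j) for j = 1, ..., n."""
    for j in range(1, len(basis) + 1):
        span_j = np.column_stack(basis[:j])
        with_Tvj = np.column_stack([span_j, T(basis[j - 1])])
        if np.linalg.matrix_rank(with_Tvj) > j:   # T(v_j) escapes the span
            return False
    return True

print(is_triangularizing_basis(T, std_basis))     # True for the example above
```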
Theorem: Existence of Upper-Triangular Form
If $V$ is a finite-dimensional complex vector space and $T \in \mathcal{L}(V)$, then there exists a basis of $V$ such that the matrix of $T$ is upper triangular.
*(Note: This requires a complex vector space because operators on real vector spaces may not have eigenvalues.)*
Connection to Eigenvalues: If the matrix of $T$ with respect to some basis is upper triangular, then the eigenvalues of $T$ are precisely the entries on the diagonal. (In the example above, the eigenvalues are 2, 5, and 8.)
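Numerically, the complex Schur decomposition is one realization of this theorem: `scipy.linalg.schur` with `output='complex'` produces a unitary change of basis under which any square complex matrix becomes upper triangular, with the eigenvalues on the diagonal. A small sketch using the example matrix:

```python
from scipy.linalg import schur

A = np.array([[2., 1., 0.],
              [0., 5., 3.],
              [0., 0., 8.]])
U, Z = schur(A, output='complex')   # A = Z @ U @ Z.conj().T, U upper triangular
print(np.diag(U).real)              # [2. 5. 8.] (the eigenvalues, up to ordering)
```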
3. Eigenspaces and Diagonal Matrices
Diagonal Matrix
A diagonal matrix is a square matrix where all entries off the diagonal are zero. (Every diagonal matrix is upper triangular.)
$$ \begin{bmatrix} 8 & 0 & 0 \\ 0 & 5 & 0 \\ 0 & 0 & 5 \end{bmatrix} $$
If an operator has a diagonal matrix with respect to some basis, the diagonal entries are its eigenvalues.
Eigenspaces
For $\lambda \in F$, the eigenspace of $T$ corresponding to $\lambda$ is defined as:
$$ E(\lambda, T) = \operatorname{null}(T - \lambda I) $$
This subspace consists of the zero vector together with all eigenvectors of $T$ corresponding to $\lambda$.
If $\mathcal{M}(T)$ with respect to $v_1, v_2, v_3$ is the diagonal matrix above (entries 8, 5, 5):
- $E(8, T) = \operatorname{span}(v_1)$
- $E(5, T) = \operatorname{span}(v_2, v_3)$
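Since $E(\lambda, T) = \operatorname{null}(T - \lambda I)$, eigenspaces can be computed with `scipy.linalg.null_space`, which returns an orthonormal basis of the null space. For the diagonal matrix above:

```python
from scipy.linalg import null_space

D = np.diag([8., 5., 5.])
E8 = null_space(D - 8 * np.eye(3))   # basis of E(8, T): spans e_1
E5 = null_space(D - 5 * np.eye(3))   # basis of E(5, T): spans e_2, e_3
print(E8.shape[1], E5.shape[1])      # dimensions: 1 2
```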
Theorem: Sum of Eigenspaces
Let $V$ be finite-dimensional and let $\lambda_1, \dots, \lambda_m$ be distinct eigenvalues of $T$. Then:
- The sum of the eigenspaces is a direct sum: $$ E(\lambda_1, T) + \dots + E(\lambda_m, T) $$
- $\dim(E(\lambda_1, T)) + \dots + \dim(E(\lambda_m, T)) \le \dim V$.
This theorem implies that eigenvectors corresponding to distinct eigenvalues are linearly independent.
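In the running example the inequality is sharp: $\dim E(8, T) + \dim E(5, T) = 1 + 2 = 3 = \dim V$. Directness of the sum also means that bases of the eigenspaces, stacked together, remain linearly independent; a one-line check reusing `E8` and `E5` from above:

```python
stacked = np.column_stack([E8, E5])                        # 3 columns in total
assert np.linalg.matrix_rank(stacked) == stacked.shape[1]  # direct sum: independent
```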
This concludes the summary of Part 2 & 3.