Eigenvectors and Upper-Triangular Matrices (Part 1)
Based on Linear Algebra Done Right by Sheldon Axler
This post covers the foundational concepts needed to understand eigenvalues and eigenvectors, including polynomials applied to operators, and presents a determinant-free proof that every operator on a finite-dimensional complex vector space has an eigenvalue.
1. Notation and Terminology
- $\mathbf{F}$ : Denotes either the real field $\mathbf{R}$ or the complex field $\mathbf{C}$.
- $V$ : A vector space over $\mathbf{F}$.
- Operator : A linear map from a vector space to itself ($T: V \to V$).
- $\mathcal{L}(V)$ : The set of all operators on $V$.
2. Powers of an Operator
For an operator $T \in \mathcal{L}(V)$ and a positive integer $m$:
- $T^m = \underbrace{T \circ \dots \circ T}_{m \text{ times}}$ (Composition of maps).
- $T^0 = I$ (Identity operator).
- If $T$ is invertible, $T^{-m} = (T^{-1})^m$.
Exponent Rules:
$T^m \circ T^n = T^{m+n}$
$(T^m)^n = T^{mn}$
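A minimal sketch of these rules in Python, representing an operator on $\mathbf{R}^2$ as a $2 \times 2$ matrix (nested tuples); the specific matrix `T` below is an arbitrary illustrative choice, not from the text.

```python
def matmul(a, b):
    """Compose two operators (2x2 matrix product)."""
    return tuple(
        tuple(sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2))
        for i in range(2)
    )

I = ((1, 0), (0, 1))  # T^0 = I, the identity operator

def power(t, m):
    """T^m as an m-fold composition, with T^0 = I."""
    result = I
    for _ in range(m):
        result = matmul(result, t)
    return result

T = ((2, 1), (0, 3))

# Exponent rules: T^m T^n = T^(m+n) and (T^m)^n = T^(mn)
assert matmul(power(T, 2), power(T, 3)) == power(T, 5)
assert power(power(T, 2), 3) == power(T, 6)
```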
3. Polynomials Applied to Operators
Let $p(z) = a_0 + a_1 z + \dots + a_m z^m$ be a polynomial with coefficients in $\mathbf{F}$. We define $p(T)$ as:
$$ p(T) = a_0 I + a_1 T + \dots + a_m T^m $$
Example
Let $P(\mathbf{R})$ be the space of real polynomials and let $D$ be the differentiation operator ($Dp = p'$).
If $p(x) = 7 - 3x + 5x^2$, then according to the definition:
$$ p(D) = 7I - 3D + 5D^2 $$
Applying this to a polynomial $q$:
$$ p(D)(q) = 7q - 3q' + 5q'' $$
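The example above can be computed concretely. Here is a small sketch (not from the text) in which a polynomial $q$ is represented by its coefficient list $[b_0, b_1, b_2, \dots]$ and $p(D) = 7I - 3D + 5D^2$ is applied term by term.

```python
def derivative(q):
    """Dq = q' on coefficient lists: d/dx sum b_k x^k = sum k*b_k x^(k-1)."""
    return [k * q[k] for k in range(1, len(q))] or [0]

def p_of_D(q):
    """Apply p(D) = 7I - 3D + 5D^2 to q, i.e. compute 7q - 3q' + 5q''."""
    q1 = derivative(q)
    q2 = derivative(q1)
    n = max(len(q), len(q1), len(q2))
    pad = lambda c: c + [0] * (n - len(c))
    a, b, c = pad(q[:]), pad(q1), pad(q2)
    return [7 * x - 3 * y + 5 * z for x, y, z in zip(a, b, c)]

# q(x) = x^2  ->  p(D)q = 7x^2 - 3*(2x) + 5*2, i.e. coefficients [10, -6, 7]
assert p_of_D([0, 0, 1]) == [10, -6, 7]
```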
Algebraic Properties
The map $p \mapsto p(T)$ is a linear map from $P(\mathbf{F})$ to $\mathcal{L}(V)$. A crucial property is multiplicativity:
$$ (pq)(T) = p(T) \circ q(T) $$
Corollary (Commutativity):
Any two polynomials in $T$ commute with each other.
$$ p(T) \circ q(T) = q(T) \circ p(T) $$
Since operator composition is generally not commutative, this is a genuinely useful property. It holds because polynomial multiplication is commutative: $p(T) \circ q(T) = (pq)(T) = (qp)(T) = q(T) \circ p(T)$.
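A self-contained numerical check of the corollary, using illustrative matrices not taken from the text: two polynomials in the same operator $T$ commute, even though two arbitrary matrices generally do not.

```python
def matmul(a, b):
    """2x2 matrix product (operator composition)."""
    return tuple(
        tuple(sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2))
        for i in range(2)
    )

def add(a, b):
    return tuple(tuple(a[i][j] + b[i][j] for j in range(2)) for i in range(2))

def scale(c, a):
    return tuple(tuple(c * a[i][j] for j in range(2)) for i in range(2))

I = ((1, 0), (0, 1))
T = ((1, 2), (3, 4))

# p(z) = 1 + 2z  and  q(z) = 3 + z^2, both evaluated at T
pT = add(I, scale(2, T))
qT = add(scale(3, I), matmul(T, T))

assert matmul(pT, qT) == matmul(qT, pT)  # polynomials in T commute

S = ((0, 1), (0, 0))
assert matmul(T, S) != matmul(S, T)      # generic matrices do not
```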
4. Existence of Eigenvalues
Theorem
Every operator on a finite-dimensional, nonzero, complex vector space has an eigenvalue.
Important Constraints
- Real Vector Spaces: False. Consider rotation by $90^\circ$ on $\mathbf{R}^2$ ($T(x, y) = (-y, x)$). It has no real eigenvalues because no non-zero vector is mapped to a scalar multiple of itself.
- Infinite-dimensional Spaces: False. Consider the multiplication operator on the space of complex polynomials defined by $(Tp)(z) = z \cdot p(z)$. If $Tp = \lambda p$ for some $p \ne 0$, then $(z - \lambda)p(z) = 0$ for all $z$, which forces $p = 0$; so $T$ has no eigenvalue.
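The rotation counterexample can be sketched numerically: for $T(x, y) = (-y, x)$, the vector $Tv$ is always orthogonal to $v$, so $Tv = \lambda v$ with $\lambda$ real would give $\lambda \lVert v \rVert^2 = \langle v, Tv \rangle = 0$, which is impossible for $v \ne 0$ since $T$ is invertible (so $\lambda = 0$ cannot occur). The sample vectors below are arbitrary.

```python
def T(v):
    """Rotation by 90 degrees: T(x, y) = (-y, x)."""
    x, y = v
    return (-y, x)

def dot(u, v):
    return u[0] * v[0] + u[1] * v[1]

# v is always perpendicular to Tv, ruling out any real eigenvalue
for v in [(1, 0), (2, 3), (-1, 5), (0.5, -0.25)]:
    assert dot(v, T(v)) == 0
```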
Proof (Without Determinants)
Let $V$ be a complex vector space with $\dim V = n > 0$, and $T \in \mathcal{L}(V)$.
- Choose a nonzero vector $v \in V$.
- Consider the list of $n+1$ vectors: $(v, Tv, T^2v, \dots, T^n v)$.
- Since $\dim V = n$, this list is linearly dependent. Thus, there exist scalars $a_0, \dots, a_n \in \mathbf{C}$ (not all zero) such that: $$ a_0 v + a_1 Tv + \dots + a_n T^n v = 0 $$
- Let $p(z) = a_0 + a_1 z + \dots + a_n z^n$. By the Fundamental Theorem of Algebra, we can factor $p(z)$: $$ p(z) = c(z - \lambda_1) \dots (z - \lambda_m) $$
- Substituting $T$ for $z$, the equation becomes: $$ c(T - \lambda_1 I) \dots (T - \lambda_m I)v = 0 $$
- Since $c \ne 0$, we may divide through by it: the composition $(T - \lambda_1 I) \dots (T - \lambda_m I)$ sends the nonzero vector $v$ to $0$, so it is not injective. Because a composition of injective maps is injective, at least one factor $(T - \lambda_j I)$ must fail to be injective.
- If $(T - \lambda_j I)$ is not injective, its null space contains a nonzero vector $w$ with $Tw = \lambda_j w$. Thus $\lambda_j$ is an eigenvalue of $T$. $\blacksquare$
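The proof can be traced on a concrete example (a sketch with hand-picked data, not from the text): view the rotation $T(x, y) = (-y, x)$ as an operator on $\mathbf{C}^2$ and take $v = (1, 0)$. Then $T^2 v = -v$, so the dependence relation yields $p(z) = 1 + z^2 = (z - i)(z + i)$, and the factor $T - iI$ is not injective.

```python
def T(v):
    """T(x, y) = (-y, x), viewed on C^2."""
    x, y = v
    return (-y, x)

v = (1, 0)
Tv = T(v)     # (0, 1)
T2v = T(Tv)   # (-1, 0) = -v

# Dependence relation 1*v + 0*Tv + 1*T^2 v = 0 gives p(z) = 1 + z^2,
# which factors over C as (z - i)(z + i).
assert tuple(a + b for a, b in zip(v, T2v)) == (0, 0)

# The factor (T - iI) is not injective: w = (1, -i) satisfies Tw = i*w,
# so lambda = i is an eigenvalue of T.
w = (1, -1j)
assert T(w) == tuple(1j * c for c in w)
```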
Most textbooks prove this theorem using the characteristic polynomial $\det(\lambda I - T)$. Axler avoids that approach, defining eigenvalues directly from the structure of vector spaces without relying on the comparatively opaque machinery of determinants early on.