[Linear Algebra] Key Concepts Summary

Based on Linear Algebra Done Right by Sheldon Axler

A summary of core concepts covering Invertibility, Products, Quotients, Duality, and Eigenvalues.


1. Invertibility and Isomorphic Vector Spaces

  • Invertible Linear Map: A linear map $T: V \to W$ is invertible if there exists a linear map $S: W \to V$ such that $ST = I_V$ and $TS = I_W$.
  • Condition for Invertibility: $T$ is invertible if and only if $T$ is both injective (one-to-one) and surjective (onto).
  • Isomorphism: Two vector spaces $V$ and $W$ are called isomorphic ($V \cong W$) if there exists an invertible linear map between them.
  • Dimension & Isomorphism: Finite-dimensional vector spaces $V$ and $W$ are isomorphic if and only if $\dim V = \dim W$.
  • Operators: For a map $T: V \to V$ on a finite-dimensional space, $T$ is injective $\iff$ $T$ is surjective $\iff$ $T$ is invertible.
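A minimal numeric sketch of these equivalences (my own illustration, not from the book): identifying an operator on a finite-dimensional space with a square matrix, injectivity, surjectivity, and invertibility all reduce to a full-rank check.

```python
import numpy as np

# For a square matrix T, injective <=> surjective <=> invertible,
# so one full-rank check decides all three.
T = np.array([[2.0, 1.0],
              [1.0, 1.0]])

n = T.shape[0]
rank = np.linalg.matrix_rank(T)
injective = (rank == n)           # null space is {0}
surjective = (rank == n)          # range is all of F^n
invertible = injective and surjective

S = np.linalg.inv(T)              # the map S with ST = I_V and TS = I_W
assert np.allclose(S @ T, np.eye(n))
assert np.allclose(T @ S, np.eye(n))
```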

2. Products of Vector Spaces

  • Product Space: $V_1 \times \dots \times V_m = \{(v_1, \dots, v_m) : v_i \in V_i\}$.
  • Dimension: $\dim(V_1 \times \dots \times V_m) = \dim V_1 + \dots + \dim V_m$.
  • Direct Sums: A sum $U_1 + \dots + U_m$ is a direct sum ($U_1 \oplus \dots \oplus U_m$) if and only if the map $\Gamma: U_1 \times \dots \times U_m \to U_1 + \dots + U_m$ defined by $\Gamma(u_1, \dots, u_m) = u_1 + \dots + u_m$ is injective.
  • Criterion: The sum is direct if and only if $\dim(U_1 + \dots + U_m) = \dim U_1 + \dots + \dim U_m$.
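The dimension criterion can be checked numerically. Below is a sketch (the helper `is_direct_sum` is my own, not from the book): each subspace is given by a matrix whose columns span it, and the sum is direct exactly when the ranks add up.

```python
import numpy as np

# Hypothetical helper: a sum of subspaces is direct iff
# dim(U1 + ... + Um) equals the sum of the individual dimensions.
def is_direct_sum(*subspaces):
    total = sum(np.linalg.matrix_rank(U) for U in subspaces)
    combined = np.linalg.matrix_rank(np.hstack(subspaces))
    return combined == total

U1 = np.array([[1.0], [0.0], [0.0]])   # span{e1}
U2 = np.array([[0.0], [1.0], [0.0]])   # span{e2}
U3 = np.array([[1.0], [1.0], [0.0]])   # span{e1 + e2}

print(is_direct_sum(U1, U2))       # direct: 1 + 1 = dim of the sum
print(is_direct_sum(U1, U2, U3))   # not direct: 1 + 1 + 1 > dim of the sum
```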

3. Quotients of Vector Spaces

  • Affine Subset: $v + U = \{v + u : u \in U\}$. Represents a subspace translated by $v$.
  • Quotient Space: $V/U = \{v + U : v \in V\}$.
  • Dimension: For finite-dimensional $V$, $\dim(V/U) = \dim V - \dim U$.
  • Quotient Map: The map $\pi: V \to V/U$ defined by $\pi(v) = v + U$ is linear.
  • Isomorphism Theorem: $V/(\operatorname{null} T) \cong \operatorname{range} T$.

4. Dual Bases and Dual Maps

  • Linear Functional: A linear map $\varphi: V \to \mathbf{F}$.
  • Dual Space ($V'$): The vector space of all linear functionals on $V$. For finite-dimensional $V$, $\dim V' = \dim V$.
  • Dual Basis: For a basis $v_1, \dots, v_n$ of $V$, the dual basis $\varphi_1, \dots, \varphi_n$ satisfies $\varphi_j(v_k) = \delta_{jk}$.
  • Dual Map ($T'$): For $T: V \to W$, defined as $T'(\varphi) = \varphi \circ T$.
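Under the identification $V = \mathbf{F}^n$, the dual basis can be computed concretely. A sketch (my own, assuming functionals are represented as row vectors): if the columns of $B$ are the basis $v_1, \dots, v_n$, then the rows of $B^{-1}$ are the dual basis, since $(B^{-1}B)_{jk} = \delta_{jk}$.

```python
import numpy as np

# Columns of B form a basis v1 = (1,0), v2 = (1,1) of R^2.
B = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# Row j of B^{-1} is the functional phi_j, because
# phi_j(v_k) = (B^{-1} B)[j, k] = delta_jk.
dual = np.linalg.inv(B)

assert np.allclose(dual @ B, np.eye(2))   # phi_j(v_k) = delta_jk
```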

5. Annihilators and Matrix of Dual Map

  • Annihilator ($U^0$): $\{\varphi \in V' : \varphi(u) = 0 \text{ for all } u \in U\}$.
  • Dimension Formula: $\dim U + \dim U^0 = \dim V$.
  • Null/Range Relationships:
    • $\operatorname{null} T' = (\operatorname{range} T)^0$
    • $\operatorname{range} T' = (\operatorname{null} T)^0$
  • Transpose Matrix: The matrix of the dual map $T'$ is the transpose of the matrix of $T$. $\mathcal{M}(T') = (\mathcal{M}(T))^t$.
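The transpose fact can be verified directly. In this sketch (my own identification, representing a functional $\varphi$ on $W$ as a row vector), $T'(\varphi) = \varphi \circ T$ acts as $\varphi \, \mathcal{M}(T)$, which is the same as applying $\mathcal{M}(T)^t$ to $\varphi$ as a column vector.

```python
import numpy as np

M_T = np.array([[1.0, 2.0],
                [3.0, 4.0],
                [5.0, 6.0]])      # T : R^2 -> R^3
phi = np.array([1.0, 0.0, -1.0])  # a functional on R^3, as a row vector

via_composition = phi @ M_T       # (phi o T) as a functional on R^2
via_transpose = M_T.T @ phi       # M(T)^t applied to phi as a column

assert np.allclose(via_composition, via_transpose)
```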

6. Polynomials

  • Fundamental Theorem of Algebra: Every non-constant polynomial with complex coefficients has a root in $\mathbf{C}$.
  • Factorization:
    • Over $\mathbf{C}$: Unique factorization (up to order) into linear factors.
    • Over $\mathbf{R}$: Unique factorization into linear factors and irreducible quadratic factors (discriminant < 0).
  • Operators: $p(T)$ is the operator obtained by replacing $z$ with $T$ in the polynomial $p$; the constant term becomes a multiple of the identity $I$.
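A small sketch of evaluating a polynomial at an operator (the helper `poly_of_operator` is my own, not from the book): powers of $T$ replace powers of $z$, and the constant term becomes a multiple of $I$.

```python
import numpy as np

def poly_of_operator(coeffs, T):
    """Evaluate p(T) for p(z) = coeffs[0] + coeffs[1] z + ... (low to high)."""
    n = T.shape[0]
    result = np.zeros_like(T)
    power = np.eye(n)          # starts at T^0 = I
    for c in coeffs:
        result = result + c * power
        power = power @ T
    return result

T = np.array([[0.0, 1.0],
              [0.0, 0.0]])
# p(z) = 1 + z, so p(T) = I + T
p_T = poly_of_operator([1.0, 1.0], T)
assert np.allclose(p_T, np.eye(2) + T)
```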

7 & 8. Eigenvalues and Invariant Subspaces

  • Invariant Subspace: $U$ is invariant under $T$ if $Tu \in U$ for all $u \in U$.
  • Eigenvalue/Vector: $Tv = \lambda v$ (with $v \ne 0$).
  • Existence: $\lambda$ is an eigenvalue $\iff T - \lambda I$ is not injective.
  • Theorem: Every operator on a finite-dimensional, non-zero complex vector space has at least one eigenvalue.
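A numeric illustration of why the complex field matters (my own example): a rotation of the plane has no real eigenvalues, but viewed as an operator on $\mathbf{C}^2$ it has eigenvalues $\pm i$.

```python
import numpy as np

# Rotation by 90 degrees: no real eigenvalue (no direction is preserved),
# but over C the eigenvalues +i and -i always exist.
R = np.array([[0.0, -1.0],
              [1.0,  0.0]])

eigvals = np.linalg.eigvals(R)    # computed over C
assert np.allclose(sorted(eigvals.imag), [-1.0, 1.0])
assert np.allclose(eigvals.real, 0.0)
```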

Points of Clarification

1. Kernel (null T) vs. Annihilator ($U^0$)

Kernel ($\operatorname{null} T$):

  • Related to a Linear Map $T$.
  • Lives in the Original Space $V$.
  • $\operatorname{null} T = \{v \in V : Tv = 0\}$.

Annihilator ($U^0$):

  • Related to a Subspace $U$.
  • Lives in the Dual Space $V'$.
  • $U^0 = \{\varphi \in V' : \varphi(u) = 0 \text{ for all } u \in U\}$.

2. Affine Subsets

Definition: An affine subset is of the form $v + U = \{v + u : u \in U\}$.

  • Intuition: A subspace $U$ shifted by a vector $v$ — like a line or plane that need not pass through the origin.
  • Application: The solution set of $Ax = b$ is an affine subset $p + \operatorname{null} A$, where $p$ is a particular solution.
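A quick numeric check of the application (my own sketch): any two solutions of $Ax = b$ differ by an element of $\operatorname{null} A$, so the whole solution set is $p + \operatorname{null} A$.

```python
import numpy as np

A = np.array([[1.0, 1.0]])        # null A = span{(1, -1)}
b = np.array([3.0])

p = np.array([3.0, 0.0])          # one particular solution of Ax = b
n_vec = np.array([1.0, -1.0])     # spans null A

# Every point of the affine subset p + null A solves Ax = b.
for t in [-2.0, 0.0, 5.0]:
    x = p + t * n_vec
    assert np.allclose(A @ x, b)
```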

3. Quotient Maps & Isomorphism

Quotient Space ($V/U$): Intuitively "collapses" the subspace $U$ to zero.

Quotient Map ($\pi$): Defined by $\pi(v) = v + U$. It is always surjective with kernel $U$.

First Isomorphism Theorem: Connects the algebraic structure to geometric quotients.

$$ V / (\operatorname{null} T) \cong \operatorname{range} T $$
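On the level of dimensions, the theorem says $\dim(V/\operatorname{null} T) = \dim V - \dim(\operatorname{null} T) = \dim(\operatorname{range} T)$, which is rank-nullity. A numeric sanity check (my own sketch, with $T$ given by a matrix):

```python
import numpy as np

T = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])   # a rank-1 map from R^3 to R^2

dim_V = T.shape[1]
rank_T = np.linalg.matrix_rank(T)      # dim(range T)
dim_null_T = dim_V - rank_T            # rank-nullity
dim_quotient = dim_V - dim_null_T      # dim(V / null T)

assert dim_quotient == rank_T          # V/(null T) is isomorphic to range T
```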

The Fundamental Theorem of Algebra

Based on Linear Algebra Done Right by Sheldon Axler (Chapter 4)

This theorem is the bedrock required to prove that every operator on a complex vector space has an eigenvalue. Axler provides an elementary proof using basic analysis rather than advanced complex analysis.


1. The Statement

Fundamental Theorem of Algebra

Every non-constant polynomial with complex coefficients has a root.

i.e., If $p \in \mathcal{P}(\mathbf{C})$ is a non-constant polynomial, then there exists $z \in \mathbf{C}$ such that $p(z) = 0$.


2. Axler's Proof Strategy

The proof has two main steps: first show that $|p|$ attains a global minimum at some point $z_0$, then show that $p(z_0) = 0$ — if not, a nearby point gives a strictly smaller value of $|p|$, a contradiction.

Step 1: Existence of a Minimum

First, we show that $|p(z)|$ attains a minimum value somewhere in $\mathbf{C}$.

  • As $|z| \to \infty$, $|p(z)| \to \infty$. (The highest degree term dominates).
  • This implies that $|p(z)|$ is large outside of some large closed disk $D$.
  • Since $p$ is continuous and the disk $D$ is compact, $|p(z)|$ attains a minimum on $D$.
  • Combined, $|p(z)|$ has a global minimum at some point $z_0 \in \mathbf{C}$.
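The minimum can be seen numerically (my own illustration, not part of Axler's proof): sampling $|p(z)|$ on a grid for $p(z) = z^2 + 1$, the minimum value $0$ is attained at $z = \pm i$, while $|p|$ is large far from the origin.

```python
import numpy as np

# Sample |p(z)| for p(z) = z^2 + 1 on a grid over [-2, 2] x [-2, 2].
xs = np.linspace(-2, 2, 401)       # step 0.01, so the grid contains 0 and +-1
X, Y = np.meshgrid(xs, xs)
Z = X + 1j * Y
absp = np.abs(Z**2 + 1)

z_min = Z.flat[np.argmin(absp)]
assert absp.min() < 1e-6                               # global minimum is 0
assert np.isclose(abs(z_min.imag), 1.0)                # attained at z = +-i
assert np.isclose(z_min.real, 0.0)
```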

Step 2: The Contradiction

Suppose $p(z_0) \ne 0$. We will show that there exists a point nearby where the absolute value is even smaller, contradicting that $z_0$ is the minimum.

The Logic:
Shift the polynomial to the origin by defining $q(z) = p(z + z_0)$. Then $|q|$ has a minimum at $0$, and $q(0) = p(z_0) \ne 0$.

Write $q(z)$ as: $$ q(z) = c + c_k z^k + \dots + c_m z^m $$ where $c = q(0)$ and $c_k$ is the first non-zero coefficient after the constant term.

For very small $z$, the $c_k z^k$ term dominates the higher order terms. By choosing the direction (argument) of $z$ correctly, we can make $c_k z^k$ point in the opposite direction of $c$.

This creates a value $q(z)$ such that: $$ |q(z)| < |c| = |q(0)| $$ This contradicts the fact that the minimum is at $0$. Therefore, $p(z_0)$ must be $0$.
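Step 2 can be made concrete with a tiny example (my own, not from the book): for $q(z) = 1 + z^2$ we have $c = 1$, $c_k = 1$, $k = 2$, and choosing $\arg z$ so that $c_k z^k$ points opposite to $c$ — here $z = it$ — gives $|q(it)| = |1 - t^2| < |q(0)|$ for small $t > 0$.

```python
# q(z) = 1 + z^2: the minimum of |q| is NOT at 0, because moving in the
# direction z = i*t makes the z^2 term cancel against the constant term.
def q(z):
    return 1 + z**2

t = 0.1
z = 1j * t
assert abs(q(z)) < abs(q(0))   # 0.99 < 1: "minimum at 0" is contradicted
```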

3. Important Corollaries

This theorem leads to the unique factorization of polynomials.

Factorization over $\mathbf{C}$

If $p \in \mathcal{P}(\mathbf{C})$ is a non-constant polynomial, then $p$ can be uniquely factored (up to order) as:

$$ p(z) = c(z - \lambda_1) \dots (z - \lambda_m) $$

where $c, \lambda_1, \dots, \lambda_m \in \mathbf{C}$.
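The factorization can be exhibited numerically (a sketch of mine using `numpy.roots`, which recovers the $\lambda_i$): for $p(z) = z^2 + 1$ the roots are $\pm i$, so $p(z) = (z - i)(z + i)$.

```python
import numpy as np

coeffs = [1.0, 0.0, 1.0]          # z^2 + 0z + 1, highest degree first
roots = np.roots(coeffs)          # the lambda_i, up to ordering

# Reconstruct p(z) = c (z - lambda_1)...(z - lambda_m) and compare
# with direct evaluation at a test point.
z = 2.0 + 1.0j
reconstructed = coeffs[0] * np.prod(z - roots)
assert np.isclose(reconstructed, np.polyval(coeffs, z))
```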

Relevance to Linear Algebra:
This factorization ensures that for any operator $T$ on a complex vector space, the characteristic equation (or minimal polynomial) has roots. Thus, eigenvalues always exist in finite-dimensional complex vector spaces.
