Vector Spaces
Topic: `"Rank-Nullity Theorem"`.
Introduction: We have discussed how transformations "collapse" information (the Kernel) and "project" information (the Image). This theorem shows that these two actions are perfectly complementary: the dimension of the domain `bbb"V"` always splits into the dimension of what is "lost" (the kernel) plus the dimension of what is "kept" (the image).
Note: `"Rank" = dim(phi[bbb"V"]) , "Nullity" = dim(ker(phi))`.
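Before the formal proof, the accounting rule can be checked numerically. The sketch below uses a hypothetical matrix `A` (chosen so its third row is the sum of the first two) as the linear map, computes the rank with numpy, and counts the near-zero singular values to get the nullity.

```python
import numpy as np

# A hypothetical 3x4 matrix representing a linear map phi: R^4 -> R^3.
# The third row equals row1 + row2, so the rank is 2 by construction.
A = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 1.0, 1.0, 1.0],
              [1.0, 3.0, 1.0, 2.0]])

rank = np.linalg.matrix_rank(A)

# Nullity = dim(V) minus the number of (numerically) nonzero singular values.
s = np.linalg.svd(A, compute_uv=False)
tol = max(A.shape) * np.finfo(float).eps * s[0]
nullity = A.shape[1] - np.count_nonzero(s > tol)

print("rank =", rank, "nullity =", nullity, "sum =", rank + nullity)
```

For this example the sum of rank and nullity equals `4 = dim(RR^4)`, as the theorem predicts.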
Exercise: Let `bbb"V"` and `bbb"W"` be vector spaces over the same field `bbb"F"` , and let `bbb"V"` be finite dimensional over `bbb"F"`, where `dim(bbb"V")` is the dimension of the vector space `bbb"V"` over `bbb"F"`.
Let `phi: bbb"V" rarr bbb"W"` be a linear transformation.
(a) Show that `phi[bbb"V"]` is a subspace of `bbb"W"`.
(b) Show that `dim(phi[bbb"V"]) = dim(bbb"V") - dim(Ker(phi))`.
Solution:
(a) Show that `phi[bbb"V"]` is a subspace of `bbb"W"`:
To show that `phi[bbb"V"]` is a subspace of `bbb"W"` , we must verify the standard subspace criteria for the set `im(phi) = {phi(v) : v in bbb"V"}`:
1. Contains the Zero Vector: Since `phi(0_{bbb"V"}) = 0_{bbb"W"}` , then `0_{bbb"W"} in phi[bbb"V"]`.
To see this:
Let `phi: bbb"V" rarr bbb"W"` be linear , so for all `u , v in bbb"V"` , `phi(u + v) = phi(u) + phi(v)` and `phi(lambda v) = lambda phi(v)`.
Take any `v in bbb"V"`. Then `0_{bbb"V"} = v - v`.
Apply `phi: phi(0_{bbb"V"}) = phi(v - v) = phi(v) + phi(-v) = phi(v) - phi(v) = 0_{bbb"W"}`.
2. Closure under addition: If `w_1 , w_2 in phi[bbb"V"]` , there exists `v_1 , v_2` such that `phi(v_1) = w_1` and `phi(v_2) = w_2`. Then `w_1 + w_2 = phi(v_1) + phi(v_2) = phi(v_1 + v_2)`. Since `v_1 + v_2 in bbb"V"` , the sum is in `phi[bbb"V"]`.
3. Closure under Scaling: For any `c in bbb"F"` and `w = phi(v)` , then `cw = c phi(v) = phi(cv)`. Since `cv in bbb"V"` , the result is in `phi[bbb"V"]`.
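The three criteria above can be sanity-checked numerically for a matrix map. This is a sketch with an arbitrary random matrix standing in for `phi`; the specific matrix and vectors are assumptions, not part of the exercise.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 5))   # a hypothetical linear map phi: R^5 -> R^3
v1, v2 = rng.standard_normal(5), rng.standard_normal(5)
c = 2.5

# 1. Zero vector: phi(0_V) = 0_W
assert np.allclose(A @ np.zeros(5), np.zeros(3))
# 2. Closure under addition: phi(v1) + phi(v2) = phi(v1 + v2)
assert np.allclose(A @ v1 + A @ v2, A @ (v1 + v2))
# 3. Closure under scaling: c * phi(v1) = phi(c * v1)
assert np.allclose(c * (A @ v1), A @ (c * v1))
print("subspace criteria hold for this example")
```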
(b) Show that `dim(phi[bbb"V"]) = dim(bbb"V") - dim(ker(phi))`:
1. Let `{u_1 , ... \ , u_k}` be a basis for `ker(phi)` , where `k = dim(ker(phi))`.
2. By the Basis Extension Theorem , we can extend this to a basis for the whole space `bbb"V"`:
`{u_1 , ... , u_k \ , v_1 , ... , v_m}` where `k + m = n = dim(bbb"V")`.
3. We claim that the set `{phi(v_1) , ... , phi(v_m)}` is a basis for `phi[bbb"V"]`.
- Spanning: Any `w in phi[bbb"V"]` is `phi(v)` for some `v = sum a_i u_i + sum b_j v_j`. Since each `phi(u_i) = 0` , we get `phi(v) = sum b_j phi(v_j)` , so the set spans `phi[bbb"V"]`.
- Linear Independence: If `sum c_j phi(v_j) = 0` , then `phi(sum c_j v_j) = 0` , meaning `sum c_j v_j in ker(phi)`. So `sum c_j v_j = sum d_i u_i` for some scalars `d_i` , i.e. `sum c_j v_j - sum d_i u_i = 0`. Because the basis `{u_i , v_j}` of `bbb"V"` is linearly independent , every coefficient in this relation must vanish; in particular , all `c_j` are `0`.
Result: Thus , `dim(im(phi)) = m = n - k`.
Substituting dimensions: `dim(phi[bbb"V"]) = dim(bbb"V") - dim(ker(phi))`.
QED
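The proof's construction can be mirrored numerically. In the sketch below (a hypothetical matrix map, assumed for illustration), the SVD supplies an orthonormal kernel basis `{u_1 , ... , u_k}`, the remaining right-singular vectors play the role of the extension `{v_1 , ... , v_m}`, and the images `phi(v_j)` turn out to be linearly independent, giving `dim(im) = m = n - k`.

```python
import numpy as np

# Hypothetical map phi: R^4 -> R^3 (row3 = row1 + row2, so the kernel is 2-dimensional).
A = np.array([[1.0, 0.0, 1.0, 2.0],
              [0.0, 1.0, 1.0, 1.0],
              [1.0, 1.0, 2.0, 3.0]])
n = A.shape[1]

# Step 1: a basis {u_1, ..., u_k} of ker(phi) from the right-singular vectors
# whose singular values are (numerically) zero.
U, s, Vt = np.linalg.svd(A)
tol = max(A.shape) * np.finfo(float).eps * s[0]
r = np.count_nonzero(s > tol)          # rank
kernel_basis = Vt[r:].T                # k = n - r columns spanning ker(phi)
k = kernel_basis.shape[1]

# Step 2: the remaining right-singular vectors v_1, ..., v_m extend this
# to a basis of V = R^n (m = n - k).
extension = Vt[:r].T

# Step 3: the images phi(v_j) are linearly independent, hence a basis of im(phi).
images = A @ extension
m = np.linalg.matrix_rank(images)

print("dim(im) =", m, "= dim(V) - dim(ker) =", n, "-", k)
```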
Transition Note: `"The Lie Algebra Minute"`.
In our next Lie Algebra Minute exercise , we will apply this accounting rule to the adjoint transformations `ad_e \ , ad_f \ ,` and `ad_h`. We know `dim (sl(2 , RR)) = 3`. We will check if `dim (Ker) + dim("Image")` equals `3` for each of them.
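As a quick numerical preview of that check (conventions assumed here: ordered basis `(e , f , h)` with brackets `[e , f] = h` , `[h , e] = 2e` , `[h , f] = -2f`), each adjoint map can be written as a `3 xx 3` matrix and its rank and nullity summed:

```python
import numpy as np

# ad_x in the assumed ordered basis (e, f, h) of sl(2, R).
ad_e = np.array([[0, 0, -2],
                 [0, 0,  0],
                 [0, 1,  0]])   # ad_e(e) = 0, ad_e(f) = h, ad_e(h) = -2e
ad_f = np.array([[ 0, 0, 0],
                 [ 0, 0, 2],
                 [-1, 0, 0]])   # ad_f(e) = -h, ad_f(f) = 0, ad_f(h) = 2f
ad_h = np.array([[2,  0, 0],
                 [0, -2, 0],
                 [0,  0, 0]])   # ad_h(e) = 2e, ad_h(f) = -2f, ad_h(h) = 0

for name, M in [("ad_e", ad_e), ("ad_f", ad_f), ("ad_h", ad_h)]:
    rank = np.linalg.matrix_rank(M)
    nullity = 3 - rank
    print(name, "rank =", rank, "nullity =", nullity, "sum =", rank + nullity)
```

Each sum comes out to `3 = dim(sl(2 , RR))`, consistent with the theorem.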
END
Question
If you have a linear transformation `phi: bbb"V" rarr bbb"W"` where `bbb"V"` is a `5`-dimensional space and the transformation "collapses" `2` dimensions into the Kernel , what is the dimension of the Image (the Rank)?
A) `3` , because the Rank must satisfy `"Rank + Nullity" = dim(bbb"V")`.
B) `7` , because the transformation must project into a larger space to compensate for the Kernel.