page2070      Vector Spaces      Matthias Lorentzen...mattegrisenforlag.com


Look at the picture beneath, then scroll down to the question and click the correct Answer button.

Vector Spaces

Topic: Construction of the `"Quotient Space"`.

Introduction: This exercise marks a shift from looking at the "insides" of the space to "grouping" its elements together. It is one of the more abstract concepts in linear algebra because it requires you to treat an entire subspace as if it were a single "zero" point. In previous exercises , we saw how a linear transformation can "collapse" a kernel to zero. A Quotient Space is the intrinsic construction behind that collapse. If we have a subspace `bbb"S"`, we can "divide" `bbb"V"` by `bbb"S"` to create a new , smaller space. In this new space , every vector in `bbb"S"` is considered equivalent to the zero vector. We aren't just looking at vectors anymore , we are looking at cosets.

Exercise: Let `bbb"V"` be a vector space over a field `bbb"F"` , and let `bbb"S"` be a subspace of `bbb"V"`. Define the quotient space `bbb"V/S"` , and show that it is a vector space over `bbb"F"`.

Solution:
1. Defining the Elements (Cosets): For any vector `v in bbb"V"` , the coset of `v` modulo `bbb"S"` is the set:
`v + bbb"S" = {v + s \ : s in bbb"S"}`.
The Quotient Space `bbb"V/S"` is the set of all such cosets : `bbb"V/S" = {v + bbb"S" \ : v in bbb"V"}`.

2. Defining the operations: To make `bbb"V/S"` a vector space , we define addition and scalar multiplication on the cosets:
- Addition: `(u + bbb"S") + (v + bbb"S") = (u + v) + bbb"S"`.
- Scalar Multiplication: `a(v + bbb"S") = (av) + bbb"S"`.
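The two coset operations can be sketched in code. The following is a minimal toy model of our own choosing (not part of the exercise): `bbb"V" = QQ^3` with `bbb"S"` the line spanned by `(1 , 1 , 0)` , using integer entries only to keep the example short. Membership in `bbb"S"` then reduces to being a scalar multiple of that direction.

```python
# Toy model (an illustrative assumption): V = Q^3, S = span{(1, 1, 0)}.
S_DIR = (1, 1, 0)

def in_S(w):
    # w in S  <=>  w = t * S_DIR for some scalar t.
    # Since S_DIR[0] = 1, the only candidate scalar is t = w[0].
    t = w[0]
    return all(w[i] == t * S_DIR[i] for i in range(3))

def cosets_equal(u, v):
    # u + S = v + S  <=>  u - v in S.
    return in_S(tuple(u[i] - v[i] for i in range(3)))

def coset_add(u, v):
    # (u + S) + (v + S) := (u + v) + S, returned via the representative u + v.
    return tuple(u[i] + v[i] for i in range(3))

def coset_scale(a, v):
    # a(v + S) := (a v) + S, returned via the representative a v.
    return tuple(a * v[i] for i in range(3))

# (2, 1, 0) and (3, 2, 0) differ by (1, 1, 0), which lies in S:
# they are different vectors in V but name the same coset in V/S.
assert cosets_equal((2, 1, 0), (3, 2, 0))
assert not cosets_equal((2, 1, 0), (2, 2, 0))
```

Note that `coset_add` and `coset_scale` just return a representative of the resulting coset; that this result is independent of the chosen representatives is exactly the "well defined" check carried out next.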

3. Showing it is a Vector Space: To prove this is a vector space , we must ensure these operations are `"well defined"` (the result doesn't change if we pick a different representative for the coset) and that the 8 axioms hold:
- Well defined: If `u + bbb"S" = u' + bbb"S"` , then `u - u' in bbb"S"`. Since `bbb"S"` is a subspace , any linear combination of `u` and `v` remains consistent within the "cloud" of `bbb"S"` (explanation beneath).
- Zero Vector: The zero vector of `bbb"V/S"` is the coset `0 + bbb"S"` , which is just the subspace itself.
- Axioms: Since addition and scaling in `bbb"V/S"` are derived directly from the operations in `bbb"V"` , all properties (associativity , commutativity , etc.) are inherited from `bbb"V"`.

Result: `bbb"V/S"` is a valid vector space. When `bbb"V"` is finite dimensional , its dimension is `dim(bbb"V") - dim(bbb"S")`. It effectively "subtracts" the dimension of `bbb"S"` from that of `bbb"V"`.
QED

Explanation: In a quotient space `bbb"V/S"` , cosets `u + bbb"S"` are the elements , and "well-defined" means that any `"operation"` (like addition , scalar multiplication , or a map defined on cosets) does not depend on which representative `u` you choose from the coset.

`"Equality of cosets"`: By definition ,
`u + bbb"S" = u' + bbb"S" hArr u - u' in bbb"S"`.
This does not mean `u = u'` , it only means they differ by a vector in `bbb"S"`. So , many different elements `u` can represent the same coset.

`"What well defined means here"`:
Suppose you define an operation on cosets , for example
`(u + bbb"S") + (v + bbb"S") := (u + v) + bbb"S"`.
If you pick different representatives `u' in u + bbb"S"` and `v' in v + bbb"S"` (so `u' = u + s_1 , v' = v + s_2` with `s_1 , s_2 in bbb"S"`) , then the result is the same coset : `(u' + v') + bbb"S" = (u + v) + bbb"S"`.

`"Compute to prove well defined"`:
`(u' + v') - (u + v) = (u + s_1 + v + s_2) - (u + v) = s_1 + s_2 in bbb"S"` , since `bbb"S"` is a subspace and hence closed under addition.
Therefore `(u' + v') + bbb"S" = (u + v) + bbb"S"` , so the operation does not depend on the choice of representatives. That is what "well defined" is checking in this context.
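The computation above can be replayed numerically. In a toy setup of our own choosing (`bbb"V" = QQ^3` with integer entries , `bbb"S" = "span"{(1 , 1 , 0)}`) , we pick representatives `u' = u + s_1` and `v' = v + s_2` and confirm that the two sums differ exactly by `s_1 + s_2 in bbb"S"`:

```python
# Toy check (an illustrative assumption): V = Q^3, S = span{(1, 1, 0)}.
S_DIR = (1, 1, 0)

def in_S(w):
    # w in S  <=>  w = t * S_DIR; since S_DIR[0] = 1, t must equal w[0].
    t = w[0]
    return all(w[i] == t * S_DIR[i] for i in range(3))

def add(u, v):
    return tuple(u[i] + v[i] for i in range(3))

def sub(u, v):
    return tuple(u[i] - v[i] for i in range(3))

# Representatives u, v and alternates u' = u + s1, v' = v + s2 with s1, s2 in S.
u, v = (2, 5, 7), (0, 3, 1)
s1, s2 = (3, 3, 0), (-1, -1, 0)
u_alt, v_alt = add(u, s1), add(v, s2)

# Well-definedness: (u' + v') - (u + v) = s1 + s2 lies in S,
# so both choices of representatives name the same sum coset.
diff = sub(add(u_alt, v_alt), add(u, v))
assert diff == add(s1, s2)
assert in_S(diff)
```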

`"Interpreting the cloud of S sentence"` :
The sentence "since `bbb"S"` is a subspace , any linear combination of `u` and `v` remains consistent within the `"cloud"` of `bbb"S"`" is informal language for the fact that:
- If you replace `u` by `u + s_1` and `v` by `v + s_2` with `s_1 , s_2 in bbb"S"` ,
- then any linear combination like `au + bv` changes only by an element of `bbb"S"` , because
`a(u + s_1) + b(v + s_2) = au + bv + (as_1 + bs_2)` , and `as_1 + bs_2 in bbb"S"` as `bbb"S"` is a subspace. So all such linear combinations stay in the same coset (same "cloud") in `bbb"V/S"`.

`"Think of" \ bbb"V/S" \ "exactly like residue classes mod n"` :
- Two vectors are "the same" in the quotient if their difference lies in `bbb"S"`.
1. Recap of cosets:
`u + bbb"S" = {u + s \ : s in bbb"S"}`.
`u + bbb"S" = u' + bbb"S" hArr u - u' in bbb"S"`.
- In particular both `u` and `u'` lie in the same coset `u + bbb"S"`: `u = u + 0 in u + bbb"S"` , and if `u - u' in bbb"S"` then `u' = u + (u' - u) in u + bbb"S"`. So `u` is in `u + bbb"S"` , and if `u + bbb"S" = u' + bbb"S"` then `u'` is in that same coset as well.

2. Analogy with residues modulo n:
For integers `a equiv b \ (mod n)` means `a - b` is a multiple of `n`. Here , `u` ~ `u'` means `u - u'` is an element of `bbb"S"`.
- "Multiple of n" `harr` "element of subspace `bbb"S"`".
- A coset `u + bbb"S"` is `u` `"modulo"` `bbb"S"`, just like the residue class of `u` `"modulo"` `n`.
There is no "multiple of S" as such , instead `bbb"S"` itself plays the role of `n ZZ`: you collapse `"everything"` in `bbb"S"` to zero in the quotient.
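The analogy can be made concrete in a few lines , with `n = 5` as an arbitrary choice of modulus:

```python
# Residue classes mod n mirror cosets in V/S:
# "a ~ b iff a - b is a multiple of n" plays the role of "u ~ u' iff u - u' in S".
n = 5

def same_class(a, b):
    # a and b name the same residue class  <=>  a - b lies in nZ.
    return (a - b) % n == 0

assert same_class(12, 2)        # 12 - 2 = 10 = 2 * 5, so 12 + 5Z = 2 + 5Z
assert not same_class(12, 3)

# Addition of classes is well defined for the same reason as in V/S:
# changing representatives changes the sum only by an element of nZ.
assert same_class(12 + 3, 2 + 8)   # 12 ~ 2 and 3 ~ 8, so the sums agree mod 5
```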

3. Does `u + bbb"S" = u' + bbb"S" rArr u - u' in bbb"S"` already give well-definedness?
Not by itself , that condition is just the definition of when two cosets are equal (or the basis property you prove about them). "Well defined" is about operations or maps on cosets.

`"Example: addition on" \ bbb"V/S"`.
You define `(u + bbb"S") + (v + bbb"S") := (u + v) + bbb"S"`. To show this is well defined , you must show: If `u + bbb"S" = u' + bbb"S"` and `v + bbb"S" = v' + bbb"S"` , then `(u + v) + bbb"S" = (u' + v') + bbb"S"`.
Using our condition:
`u + bbb"S" = u' + bbb"S" rArr u - u' in bbb"S"`.
`v + bbb"S" = v' + bbb"S" rArr v - v' in bbb"S"`.
- Since `bbb"S"` is a subspace , `(u + v) - (u' + v') = (u - u') + (v - v') in bbb"S"`.
- Therefore `(u + v) + bbb"S" = (u' + v') + bbb"S"`.
That last step is exactly the `"well defined check"`: it shows the result does not depend on which representatives `u , v` you choose from the cosets. So:
`u + bbb"S" = u' + bbb"S" rArr u - u' in bbb"S"` is `"the equivalence relation/coset equality statement"`.
- To prove an operation (or map) is well defined , you use that statement plus the subspace properties of `bbb"S"` (closure under addition and scalar multiplication) to show that changing representatives doesn't change the resulting coset.

`"Why is"` `dim(bbb"V/S") = dim(bbb"V") - dim(bbb"S")?`
Think of `bbb"V/S"` as "forgetting" all directions that lie in `bbb"S"`. The formula `dim(bbb"V/S") = dim(bbb"V") - dim(bbb"S")` says:
once you declare all of `bbb"S"` to be zero , you only keep the directions that were not already in `bbb"S"`.

`"Start with a basis adapted to"` `bbb"S"`: Assume everything is finite dimensional. Pick a basis of `bbb"S"` ,
`(s_1 , ... \ , s_k)`.
Extend it to a basis of `bbb"V"` ,
`(s_1 , ... \ , s_k \ , v_{k + 1} , ... \ , v_n)`.
So: `dim(bbb"S") = k` , `dim(bbb"V") = n` , the "extra" basis vectors `v_{k + 1} , ... \ , v_n` are the directions not contained in `bbb"S"`.

`"What happens in the quotient" \ bbb"V/S" \ ?`
In `bbb"V/S"` the coset `0 + bbb"S" = bbb"S"` itself is the zero element. That means
- Each `s_i in bbb"S"` satisfies `s_i + bbb"S" = bbb"S"` , so all `s_i` become the same element: the zero of `bbb"V/S"`.
- Geometrically : all vectors in `bbb"S"` are "collapsed" to a single point.
Thus the basis of `bbb"S"` does not give non zero directions in the quotient , they are all absorbed into the zero coset.

`"Which basis vectors survive ?"`
Look at the cosets of the remaining basis vectors:
`v_{k + 1} + bbb"S" , ... \ , v_n + bbb"S"`.
1. Spanning: Any vector `v in bbb"V"` can be written
`v = sum_{i = 1}^k a_i s_i + sum_{j = k + 1}^n b_j v_j`. In the quotient:
`v + bbb"S" = sum_{i = 1}^k a_i s_i + sum_{j = k + 1}^n b_j v_j + bbb"S" = sum_{j = k + 1}^n b_j (v_j + bbb"S")` , because `sum a_i s_i in bbb"S"` becomes zero in `bbb"V/S"`. So every coset is a linear combination of `v_{k + 1} + bbb"S" , ... \ , v_n + bbb"S"`. They span `bbb"V/S"`.

`"Why is" sum_{j = k + 1}^n b_j v_j + bbb"S" = sum_{j = k + 1}^n b_j (v_j + bbb"S") ?`
Write the key step as one use of the quotient map
`pi : bbb"V" rarr bbb"V/S"` , `pi(v) = v + bbb"S"`.
Let `v = sum_{i = 1}^k a_i s_i + sum_{j = k + 1}^n b_j v_j` , with `s_i` a basis of `bbb"S"` and
`(s_1 , ... \ , s_k \ , v_{k + 1} , ... \ , v_n)` a basis of `bbb"V"`.
Apply `pi` and use linearity:
`pi(v) = pi(sum_{i = 1}^k a_i s_i + sum_{j = k + 1}^n b_j v_j) = sum_{i = 1}^k a_i pi(s_i) + sum_{j = k + 1}^n b_j pi(v_j)`.
Now identify each term:
- `pi(s_i) = s_i + bbb"S" = bbb"S"` (zero coset) , since `s_i in bbb"S"`.
- `pi(v_j) = v_j + bbb"S"`.
So `sum_{i = 1}^k a_i pi(s_i) = sum_{i = 1}^k a_i bbb"S" = bbb"S"` is just the zero element in `bbb"V/S"` , and we get `v + bbb"S" = pi(v) = bbb"S" + sum_{j = k + 1}^n b_j (v_j + bbb"S") = sum_{j = k + 1}^n b_j (v_j + bbb"S")` , because `bbb"S"` acts as a zero in `bbb"V/S"`.
This is the exact point where the `b_j` are "freed" : linearity of `pi` lets you pull the scalars out , and the S-part disappears because it becomes zero in the quotient , leaving the same coefficients `b_j` now attached to the cosets `v_j + bbb"S"`.
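The behaviour of `pi` can be illustrated in a toy case of our own choosing: `bbb"V" = QQ^3` with `bbb"S" = "span"{(1 , 0 , 0)}`. Every coset `v + bbb"S"` then contains exactly one vector with first coordinate `0` , which we use to encode the coset:

```python
# Toy quotient map (an illustrative assumption): V = Q^3, S = span{(1, 0, 0)}.
# Each coset v + S is encoded by its unique representative with first coordinate 0.

def pi(v):
    # Quotient map V -> V/S, returning the canonical representative of v + S.
    return (0, v[1], v[2])

def add(u, v):
    return tuple(u[i] + v[i] for i in range(3))

def scale(a, v):
    return tuple(a * v[i] for i in range(3))

# pi kills S: every s = (t, 0, 0) in S maps to the zero coset.
assert pi((7, 0, 0)) == (0, 0, 0)

# Linearity of pi, which is what "pull the scalars out" uses:
u, v, a = (1, 2, 3), (4, 5, 6), 10
assert pi(add(u, v)) == add(pi(u), pi(v))
assert pi(scale(a, u)) == scale(a, pi(u))
```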

2. Linear independence: If `sum_{j = k + 1}^n c_j(v_j + bbb"S") = bbb"S"` (zero coset) , then `(sum_{j = k + 1}^n c_j v_j) + bbb"S" = bbb"S"` , so `sum_{j = k + 1}^n c_j v_j in bbb"S"`. But the original list `(s_1 , ... \ , s_k , v_{k + 1} , ... \ , v_n)` is a basis of `bbb"V"` , so the only way a combination of the `v_j` lies in `bbb"S"` is with all `c_j = 0`. Hence `(v_{k + 1} + bbb"S" , ... \ , v_n + bbb"S")` is a basis of `bbb"V/S"` , so `dim(bbb"V/S") = n - k = dim(bbb"V") - dim(bbb"S")`.
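The dimension count can be spot-checked in the same toy case as before (our own choice , not from the exercise): `bbb"V" = QQ^3` , `bbb"S" = "span"{(1 , 0 , 0)}` , so `k = 1` , `n = 3` , and the surviving cosets `e_2 + bbb"S" , e_3 + bbb"S"` should give `dim(bbb"V/S") = 2`:

```python
# Toy dimension count (an illustrative assumption): V = Q^3, S = span{(1, 0, 0)},
# so dim V = 3, dim S = 1, and dim(V/S) should be 3 - 1 = 2.

def pi(v):
    # Encode the coset v + S by its unique representative with first coordinate 0.
    return (0, v[1], v[2])

e2, e3 = (0, 1, 0), (0, 0, 1)

# Spanning: for any v, pi(v) = v[1] * pi(e2) + v[2] * pi(e3);
# the S-component v[0] has been absorbed into the zero coset.
v = (9, 4, -2)
combo = tuple(v[1] * a + v[2] * b for a, b in zip(pi(e2), pi(e3)))
assert pi(v) == combo

# The surviving cosets are nonzero and distinct, and there are n - k = 2 of them.
assert pi(e2) != (0, 0, 0) and pi(e3) != (0, 0, 0) and pi(e2) != pi(e3)
assert 3 - 1 == len([e2, e3])
```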

The "surviving" basis vectors `v_{k + 1} , ... \ , v_n` represent distinct cosets : if two of them landed in the same coset , their difference would lie in `bbb"S"` , contradicting the fact that the original list is a basis. To see this , take two different indices `i ne j` with `k + 1 le i , j le n`. Assume for contradiction that these two basis elements of the quotient lie in the same coset:
`v_i + bbb"S" = v_j + bbb"S"`.
By definition of coset equality , this means `v_i - v_j in bbb"S"`. Since `bbb"S" = "span" {s_1 , ... \ , s_k}` , we can write `v_i - v_j = a_1 s_1 + ... + a_k s_k` for some scalars `a_1 , ... \ , a_k`.
Rearrange: `v_i - v_j - a_1 s_1 - ... - a_k s_k = 0`.
Now look at this as a linear combination of the basis of `bbb"V"` , in which the remaining basis vectors of `bbb"V"` have coefficient `0`. Since `(s_1 , ... \ , s_k , v_{k + 1} , ... \ , v_n)` is a basis of `bbb"V"` , it is linearly independent , so the only way this combination can give `0` is if all coefficients are `0`. But the coefficient of `v_i` is `1` , which would force `1 = 0` , a contradiction. END

Transition Note: The Lie Algebra Minute:
This "Quotient" idea is exactly what happens when we look at the Image of an adjoint map. When we apply `ad_e` , the kernel (the `e` direction) is "ignored" or "zeroed out". The resulting `2D` image is structurally very similar to the quotient space `sl(2 , RR)"/"ker(ad_e)`. In our next Lie Algebra exercise , we will see if we can identify this quotient space geometrically using our `x , y , z` axis model!

Question

In the quotient space `bbb"V/S"` , what does it mean if two different vectors `u` and `v` from the original space `bbb"V"` result in the same coset (i.e. `u + bbb"S" = v + bbb"S"`) ?

A) It means that `u` and `v` are actually the same vector in `bbb"V"`.

B) It means that the difference between them `(u - v)` lies entirely within the subspace `bbb"S"`.