Vector Spaces
Introduction: This exercise is the "bridge" between the algebra you learned in high school (solving for x and y) and the Vector Space theory you are mastering now. It shows that solving a system of linear equations is really a question about spanning space.
Exercise (System of Linear Equations): Consider the system of `m` linear equations in `n` unknowns
`a_{11} X_1 + a_{12} X_2 + ... + a_{1n} X_n = b_1`
`a_{21} X_1 + a_{22} X_2 + ... + a_{2n} X_n = b_2`
`vdots`
`a_{m1} X_1 + a_{m2} X_2 + ... + a_{mn} X_n = b_m` , where `a_{ij} , b_i in bbb"F"`.
a) Show that there exists a solution `X_1 , ... \ , X_n in bbb"F"` satisfying all `m` equations if and only if the vector `beta = (b_1 , ... \ , b_m )` of `bbb"F"^m` is a linear combination of the vectors `alpha_j = (a_{1j} , ... \ , a_{mj})`.
b) Using part a), show that if `n = m` and `{alpha_j \ "for" \ j = 1 , ... \ , n}` is a basis for `bbb"F"^n` , then the system always has a unique solution.
Reformulate the Exercise: We are translating a system of m linear equations into vector language.
- Part a: Prove that the system has a solution if and only if the constant vector `beta` is "reachable" by (i.e. lies in the span of) the column vectors `alpha_j`.
- Part b: Prove that if the number of equations equals the number of unknowns `(n = m)` and the columns form a basis , there is exactly one solution for any `beta`.
General Strategy: We will rewrite the entire list of equations as a single Vector Equation. This allows us to use the definition of Spanning (for existence) and Linear Independence (for uniqueness) that we proved in the previous exercise.
Specific Theory Used:
- Linear Combinations: A vector is a combination of `{alpha_1 , ... \ , alpha_n}` if `v = X_1 alpha_1 + ... + X_n alpha_n`.
- The "Unique Representation" Theorem: Every vector in a space has exactly one way to be written as a combination of basis vectors.
- Column Vectors: Viewing the coefficients of `X_j` across all `m` equations as a single vector `alpha_j in bbb"F"^m`.
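The "column vector" viewpoint can be sketched in plain Python. This is a minimal illustration with a made-up `2 xx 3` coefficient grid (the numbers are not from the exercise): each row is one equation, and slicing down a column collects the coefficients of one unknown `X_j` into the vector `alpha_j`.

```python
# Hypothetical 2x3 coefficient grid: row i holds the coefficients a_{i1}, ..., a_{in}
# of equation i; the numbers are illustrative only.
A = [[1, 2, 0],
     [3, 4, 5]]

# alpha_j collects the coefficient of X_j from every equation, i.e. column j of A.
columns = [tuple(row[j] for row in A) for j in range(len(A[0]))]
print(columns)  # [(1, 3), (2, 4), (0, 5)]
```

Each tuple in `columns` is one vector `alpha_j in bbb"F"^m`, here with `m = 2`.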
Solution:
Part a: Existence. The system can be written as
`X_1 ((a_{11}), (a_{21}) , (vdots) , (a_{m1})) + X_2 ((a_{12}), (a_{22}) , (vdots) , (a_{m2})) + ... + X_n ((a_{1n}), (a_{2n}) , (vdots) , (a_{mn})) = ((b_1), (b_2) , (vdots) , (b_{m}))`
which simplifies to `X_1 alpha_1 + X_2 alpha_2 + ... + X_n alpha_n = beta`.
By definition , a "solution" is a tuple of scalars `(X_1 , ... \, X_n)` that makes this equality true. This is exactly the definition of `beta` being a linear combination of the vectors `alpha_j`. Conversely , if `beta` is not in the span of these vectors , no such scalars `X_j` exist , and the system has no solution.
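The equivalence "solution = linear combination" can be checked numerically. This is a minimal sketch over the rationals using Python's `fractions` for exact field arithmetic; the concrete system and the candidate solution are invented for illustration.

```python
from fractions import Fraction as F

# Hypothetical columns of a 2x2 system over the field Q (illustrative numbers).
alpha_1 = (F(1), F(3))   # coefficients of X_1 in equations 1 and 2
alpha_2 = (F(2), F(4))   # coefficients of X_2 in equations 1 and 2
beta    = (F(5), F(11))  # right-hand side (b_1, b_2)

def combine(weights, columns):
    """Return the linear combination sum_j weights[j] * columns[j], componentwise."""
    m = len(columns[0])
    return tuple(sum(w * col[i] for w, col in zip(weights, columns))
                 for i in range(m))

# The tuple X = (1, 2) satisfies both equations exactly when it writes beta
# as a linear combination of alpha_1 and alpha_2:
X = (F(1), F(2))
print(combine(X, (alpha_1, alpha_2)) == beta)  # True
```

The check succeeds because `1 * alpha_1 + 2 * alpha_2 = (1 + 4 , 3 + 8) = (5, 11) = beta`.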
Part b: Uniqueness. Let `n = m` and `{alpha_1 , ... \ , alpha_n}` be a basis for `bbb"F"^n`.
1. Existence: Since a basis spans `bbb"F"^n` , every possible vector `beta in bbb"F"^n` can be written as a linear combination of the `alpha_j`. Thus a solution always exists.
2. Uniqueness: We proved in our last exercise that the representation of a vector over a basis is unique. Therefore there is only one possible set of scalars `(X_1 , ... \ , X_n)` that works.
QED
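For a concrete instance of Part b, a `2 xx 2` system can be solved in closed form. This is a minimal sketch, assuming the same illustrative columns as above and exact arithmetic via `fractions`; the solve step uses Cramer's rule for the `2 xx 2` case, which is one convenient way to exhibit the unique solution when the determinant is nonzero (i.e. when the columns are a basis of `bbb"F"^2`).

```python
from fractions import Fraction as F

# Illustrative columns alpha_1 = (1, 3), alpha_2 = (2, 4) of F^2.
a11, a12 = F(1), F(2)
a21, a22 = F(3), F(4)
det = a11 * a22 - a12 * a21  # -2, nonzero => columns are linearly independent
assert det != 0              # basis condition for n = m = 2

def solve(b1, b2):
    """Unique (X_1, X_2) with X_1*alpha_1 + X_2*alpha_2 = beta, via Cramer's rule."""
    x1 = (b1 * a22 - a12 * b2) / det
    x2 = (a11 * b2 - b1 * a21) / det
    return x1, x2

x1, x2 = solve(F(5), F(11))
print(x1, x2)  # 1 2
```

Because `det != 0`, every choice of `beta = (b_1, b_2)` yields exactly one pair `(X_1, X_2)`, matching the existence-plus-uniqueness conclusion of Part b.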
Question
If the system is homogeneous , meaning `beta = (0, 0 , ... \ , 0)` , and the vectors `alpha_j` are a basis , how many solutions are there?
A) Only one: `X_1 = X_2 = ... = X_n = 0`.
B) Infinitely many , because the vectors span the whole space.