Vector Spaces
A set of vectors $\vec{v}_1, \vec{v}_2, \ldots, \vec{v}_n$ is linearly independent if none of the vectors can be expressed as a linear combination of the remaining $n-1$ vectors.
An alternative definition is that if $c_1 \vec{v}_1 + c_2 \vec{v}_2 + \ldots + c_n \vec{v}_n = \vec{0}$, then the only values of $c_i$ that satisfy this are $c_1 = c_2 = \ldots = c_n = 0$. That is, if the column vectors of $A$ are linearly independent, then the only solution to $A\vec{x}=\vec{0}$ is $\vec{x} =\vec{0}$.
Thus a matrix $A$ has linearly independent columns if and only if the equation $A\vec{x} = \vec{0}$ has exactly one solution.
If the columns of $A$ are not linearly independent, then $A\vec{x} =\vec{0}$ will have infinitely many solutions.
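The following is a minimal numerical sketch of this criterion (matrix values chosen here purely for illustration): the columns of $A$ are independent exactly when the rank equals the number of columns, which is when $A\vec{x}=\vec{0}$ has only the trivial solution.

```python
import numpy as np

# Columns of A: independence holds iff rank(A) equals the number of columns,
# i.e. iff A x = 0 has only the trivial solution x = 0.
A = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [1.0, 3.0]])
print(np.linalg.matrix_rank(A) == A.shape[1])   # True: columns independent

# Here the second column is twice the first, so the columns are dependent
# and A x = 0 has infinitely many solutions, e.g. x = (2, -1).
B = np.array([[1.0, 2.0],
              [2.0, 4.0]])
print(np.linalg.matrix_rank(B) == B.shape[1])   # False
```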
A vector space is a set $V$ with two operations:
- Addition of the elements of $V$, i.e. if $\vec{v}$, $\vec{u} \in V$, then $\vec{v} + \vec{u} \in V$
- Multiplication of the elements of $V$ by a scalar, i.e. if $\vec{v} \in V$, and $\alpha \in \mathbb{R}$ then $\alpha\vec{v}~\in~V$,
which satisfies the following conditions:
- $\vec{v} + \vec{u} = \vec{u} + \vec{v}$
- $\vec{u} + \left(\vec{v} + \vec{w} \right) =\left(\vec{u} + \vec{v}\right) + \vec{w}$
- There exists a vector $\vec{0} \in V$ such that $ \vec{u} + \vec{0} = \vec{u}$ for all $\vec{u} \in V$
- The scalar $1 \in \mathbb{R}$ satisfies $1\vec{u} = \vec{u}$ for all $\vec{u} \in V$
- For any vector $\vec{v} \in V$, there exists a vector $\vec{u} \in V$ such that $\vec{v} + \vec{u} = \vec{0}$, which is denoted as $\vec{u} = - \vec{v}$
- For any $a$, $b \in \mathbb{R}$ and $\vec{v} \in V$, $a\left(b \vec{v} \right) = \left( a b\right)\vec{v}$
- $a\left( \vec{v} + \vec{u} \right) = a \vec{v} + a \vec{u}$
- $\left( a + b \right)\vec{v} = a \vec{v} + b \vec{v}$
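As a quick sanity check (not a proof), the sketch below spot-checks these axioms numerically for $\mathbb{R}^3$ with the usual componentwise addition and scalar multiplication, using random vectors.

```python
import numpy as np

# Spot-check the vector-space axioms for R^3 with componentwise operations.
rng = np.random.default_rng(0)
u, v, w = rng.standard_normal((3, 3))
a, b = 2.0, -0.5

assert np.allclose(u + v, v + u)                  # commutativity of addition
assert np.allclose(u + (v + w), (u + v) + w)      # associativity of addition
assert np.allclose(u + np.zeros(3), u)            # additive identity
assert np.allclose(u + (-u), np.zeros(3))         # additive inverse
assert np.allclose(1.0 * u, u)                    # scalar identity
assert np.allclose(a * (b * v), (a * b) * v)      # compatibility of scalar multiplication
assert np.allclose(a * (u + v), a * u + a * v)    # distributivity over vector addition
assert np.allclose((a + b) * v, a * v + b * v)    # distributivity over scalar addition
```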
A basis of a vector space is a set of linearly independent vectors that spans the space. Thus, if you add another vector from the vector space to the basis set, it will be a linear combination of the vectors from the basis.
The number of vectors in the basis is the dimension of the vector space.
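For example (a brief sketch with vectors chosen here for illustration), the standard basis of $\mathbb{R}^2$ has two vectors, so the dimension is 2, and any further vector from the space is a combination of the basis vectors; solving a linear system recovers the coefficients.

```python
import numpy as np

# Columns of B are the standard basis vectors of R^2, so dim(R^2) = 2.
B = np.array([[1.0, 0.0],
              [0.0, 1.0]])
x = np.array([3.0, -2.0])      # an extra vector from the space

c = np.linalg.solve(B, x)      # coefficients of x in the basis
print(c)                       # [ 3. -2.], i.e. x = 3*e1 - 2*e2
```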
The rank of $A$ is the dimension of its column space; it is also the number of pivots in $A$ when performing Gaussian elimination.
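The sketch below illustrates this connection (sympy is used here for exact row reduction; the example matrix is arbitrary): the number of pivot columns reported by `rref()` matches the numerical rank.

```python
import numpy as np
import sympy as sp

# The second row is twice the first, so the rank should be 2.
A = sp.Matrix([[1, 2, 3],
               [2, 4, 6],
               [1, 1, 1]])

rref_form, pivot_cols = A.rref()   # reduced row echelon form and pivot column indices
print(len(pivot_cols))             # 2 pivots, so rank(A) = 2

A_np = np.array(A.tolist(), dtype=float)
print(np.linalg.matrix_rank(A_np)) # 2 as well
```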
The set of vectors $\boldsymbol{u}_k$ is orthogonal. Normalizing each vector as $\boldsymbol{e}_j = \dfrac{\boldsymbol{u}_j}{ \left\| \boldsymbol{u}_j \right\|}$ gives a set of orthonormal vectors.
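A minimal sketch of this step, assuming an orthogonal pair $\boldsymbol{u}_1, \boldsymbol{u}_2$ chosen here for illustration: after normalization, stacking the $\boldsymbol{e}_j$ as columns of $E$ gives $E^\mathsf{T} E = I$.

```python
import numpy as np

u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([1.0, -1.0, 0.0])
assert np.isclose(u1 @ u2, 0.0)           # the u_k are orthogonal

e1 = u1 / np.linalg.norm(u1)              # e_j = u_j / ||u_j||
e2 = u2 / np.linalg.norm(u2)

E = np.column_stack([e1, e2])
print(np.allclose(E.T @ E, np.eye(2)))    # True: the e_j are orthonormal
```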