Vector Spaces

Definition: Linear Independence

A set of vectors $\vec{v}_1, \vec{v}_2, \ldots, \vec{v}_n$ is linearly independent if none of the vectors can be expressed as a linear combination of the remaining $n-1$ vectors.

An alternative definition is that if $c_1 \vec{v}_1 + c_2 \vec{v}_2 + \ldots + c_n \vec{v}_n = \vec{0}$, then the only values of the $c_i$ that satisfy this are $c_1 = c_2 = \ldots = c_n = 0$. That is, if the column vectors of $A$ are linearly independent, then the only solution to $A\vec{x}=\vec{0}$ is $\vec{x} = \vec{0}$.

Thus a matrix $A$ has linearly independent columns if and only if the equation $A\vec{x} = \vec{0}$ has exactly one solution, namely the trivial solution $\vec{x} = \vec{0}$.

If the columns of $A$ are not linearly independent, then $A\vec{x} =\vec{0}$ will have infinitely many solutions.
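
As a quick numerical check, one can compare the rank of $A$ to its number of columns: the columns are linearly independent exactly when the two are equal. A minimal NumPy sketch (the matrix below is just an illustrative example):

```python
import numpy as np

# Columns of A are the vectors being tested; here the third column
# equals the sum of the first two, so they are dependent.
A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])

# The columns are linearly independent iff rank(A) equals the number of
# columns, i.e. iff A x = 0 has only the trivial solution x = 0.
rank = np.linalg.matrix_rank(A)
print(rank, rank == A.shape[1])   # 2 False
```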

Definition: Vector Spaces

A vector space is a set $V$ with two operations:

  • Addition of the elements of $V$, i.e. if $\vec{v}$, $\vec{u} \in V$, then $\vec{v} + \vec{u} \in V$
  • Multiplication of the elements of $V$ by a scalar, i.e. if $\vec{v} \in V$, and $\alpha \in \mathbb{R}$ then $\alpha\vec{v}~\in~V$,

which satisfies the following conditions:

  1. $\vec{v} + \vec{u} = \vec{u} + \vec{v}$
  2. $\vec{u} + \left(\vec{v} + \vec{w} \right) =\left(\vec{u} + \vec{v}\right) + \vec{w}$
  3. There exists a vector $\vec{0} \in V$ such that $ \vec{u} + \vec{0} = \vec{u}$ for all $\vec{u} \in V$
  4. For the scalar $1 \in \mathbb{R}$, we have $1\vec{u} = \vec{u}$ for all $\vec{u} \in V$
  5. For any vector $\vec{v} \in V$, there exists a vector $\vec{u} \in V$ such that $\vec{v} + \vec{u} = \vec{0}$, which is denoted as $\vec{u} = - \vec{v}$
  6. For any $a$, $b \in \mathbb{R}$ and $\vec{v} \in V$, then $a\left(b \vec{v} \right) = \left( a b\right)\vec{v}$
  7. $a\left( \vec{v} + \vec{u} \right) = a \vec{v} + a \vec{u}$
  8. $\left( a + b \right)\vec{v} = a \vec{v} + b \vec{v}$
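
As a concrete (non-exhaustive) illustration, here is a minimal NumPy spot-check of axioms 1, 7, and 8 for $V = \mathbb{R}^3$ with randomly chosen vectors and scalars; it illustrates the axioms rather than proving them:

```python
import numpy as np

rng = np.random.default_rng(0)
u, v = rng.standard_normal(3), rng.standard_normal(3)
a, b = rng.standard_normal(2)

print(np.allclose(v + u, u + v))                  # axiom 1
print(np.allclose(a * (v + u), a * v + a * u))    # axiom 7
print(np.allclose((a + b) * v, a * v + b * v))    # axiom 8
```
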
Definition: Subspaces
If $V$ is a vector space and $W \subset V$ is itself a vector space under the same operations, then $W$ is called a subspace of $V$.
Definition: Span
If $\mathcal{A} = \left\{ \boldsymbol{v}_1, \ldots, \boldsymbol{v}_k \right\}$ where each vector $\boldsymbol{v}_i \in\mathbb{R}^n$, then the span of $\mathcal{A}$ is the set of all possible linear combinations of the vectors in $\mathcal{A}$.
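
Equivalently, a vector $\boldsymbol{b}$ lies in the span of $\mathcal{A}$ exactly when $A\boldsymbol{x} = \boldsymbol{b}$ is consistent, where the columns of $A$ are the vectors of $\mathcal{A}$. A minimal NumPy sketch (the vectors are illustrative examples):

```python
import numpy as np

# Columns of A are the vectors v_1, v_2 in R^3.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([2.0, 3.0, 5.0])   # b = 2 v_1 + 3 v_2, so b is in the span
c = np.array([1.0, 1.0, 0.0])   # not a combination of v_1 and v_2

def in_span(A, b):
    # b is in the span of the columns iff appending b leaves the rank unchanged.
    return np.linalg.matrix_rank(np.column_stack([A, b])) == np.linalg.matrix_rank(A)

print(in_span(A, b), in_span(A, c))   # True False
```
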
Definition: Basis
A basis of a vector space is a maximal collection of linearly independent vectors from that vector space.

Thus, if you add any other vector from the vector space to the basis, it will be a linear combination of the basis vectors.

The number of vectors in the basis is the dimension of the vector space.
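
As an illustrative sketch, SymPy's `columnspace()` returns a basis for the span of a set of vectors placed as columns, and the number of basis vectors is the dimension (the matrix is an example):

```python
import sympy as sp

# Three vectors in R^3 as columns; the third is the sum of the first two.
M = sp.Matrix([[1, 0, 1],
               [0, 1, 1],
               [1, 1, 2]])

basis = M.columnspace()   # a basis for the span of the columns
print(len(basis))         # 2: the dimension of the span
```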

Definition: Column and Row Spaces
The column space of a matrix $A$ is the span of the columns of $A$. Similarly, the row space is the span of the rows of $A$, which is the column space of $A^T$.
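
Continuing the SymPy sketch, `columnspace()` and `rowspace()` compute bases for these spaces, and the row space of $A$ spans the same space as the column space of $A^T$ (the bases returned may differ):

```python
import sympy as sp

A = sp.Matrix([[1, 2, 0],
               [0, 1, 1]])

print(A.columnspace())    # basis for the column space (vectors in R^2)
print(A.rowspace())       # basis for the row space (rows in R^3)
print(A.T.columnspace())  # spans the same space as A.rowspace()
```
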
Definition: Null Spaces
The null space of a matrix $A$ is the collection of all solutions to $A \boldsymbol{x} = \boldsymbol{0}$.
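
SymPy's `nullspace()` returns a basis for this solution set; a sketch on an example matrix with dependent columns:

```python
import sympy as sp

A = sp.Matrix([[1, 0, 1],
               [0, 1, 1],
               [1, 1, 2]])

null_basis = A.nullspace()   # basis vectors x with A x = 0
print(null_basis)            # [Matrix([-1, -1, 1])]

# Sanity check: A maps each basis vector to the zero vector.
for x in null_basis:
    assert A * x == sp.zeros(3, 1)
```
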
Definition: Rank
The rank of a matrix $A$ is the dimension of the column space of $A$.

The rank is also the number of pivots in $A$ when performing Gaussian elimination.
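
Both characterizations can be checked with SymPy's `rref()`, which returns the reduced row echelon form together with the pivot column indices (same example matrix as above):

```python
import sympy as sp

A = sp.Matrix([[1, 0, 1],
               [0, 1, 1],
               [1, 1, 2]])

rref_form, pivots = A.rref()     # row echelon form and pivot columns
print(pivots)                    # (0, 1): two pivots
print(len(pivots) == A.rank())   # True: rank = number of pivots
```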

Definition: Projections

The projection of a vector $\boldsymbol{v}$ onto a nonzero vector $\boldsymbol{u}$ is given by

$$ \textrm{proj}_{\boldsymbol{u}} \left(\boldsymbol{v} \right) = \dfrac{\boldsymbol{v} \cdot \boldsymbol{u} }{\boldsymbol{u} \cdot \boldsymbol{u} }\boldsymbol{u}. $$
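
A direct NumPy translation of the formula, with example vectors:

```python
import numpy as np

def proj(u, v):
    """Projection of v onto the nonzero vector u."""
    return (np.dot(v, u) / np.dot(u, u)) * u

v = np.array([3.0, 4.0])
u = np.array([1.0, 0.0])
p = proj(u, v)
print(p)                  # [3. 0.]
print(np.dot(v - p, u))   # 0.0: the residual is orthogonal to u
```
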
Definition: Gram-Schmidt

Given $k$ vectors $\boldsymbol{v_1}, \ldots, \boldsymbol{v_k}$, the Gram–Schmidt process defines the vectors $\boldsymbol{u_1}, \ldots, \boldsymbol{u_k}$ as follows:

$$ \begin{align*} \boldsymbol{u_1} & = \boldsymbol{v_1} \\ & \vdots \\ \boldsymbol{u_k} & = \boldsymbol{v_k} - \sum_{j=1}^{k-1} \textrm{proj}_{\boldsymbol{u_j}} \left(\boldsymbol{v_k} \right). \end{align*} $$

The vectors $\boldsymbol{u_1}, \ldots, \boldsymbol{u_k}$ are mutually orthogonal. Normalizing them as $\boldsymbol{e_j} = \dfrac{\boldsymbol{u_j}}{ \left\| \boldsymbol{u_j} \right\|}$ yields a set of orthonormal vectors.
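
A compact NumPy sketch of the process as written above (classical Gram–Schmidt; it assumes the input vectors are linearly independent, so no $\boldsymbol{u_j}$ is zero):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent vectors."""
    basis = []
    for v in vectors:
        u = v.astype(float)
        for q in basis:  # subtract the projection of v onto each earlier u_j
            u = u - (np.dot(v, q) / np.dot(q, q)) * q
        basis.append(u)
    return [u / np.linalg.norm(u) for u in basis]

vs = [np.array([1.0, 1.0, 0.0]),
      np.array([1.0, 0.0, 1.0]),
      np.array([0.0, 1.0, 1.0])]
E = np.column_stack(gram_schmidt(vs))
print(np.allclose(E.T @ E, np.eye(3)))   # True: the e_j are orthonormal
```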