Linear Equations

Definition: Systems of Linear Equations

A system of linear equations (or linear system) is a collection of one or more linear equations involving the same variables. A system of $m$ equations in $n$ unknowns $x_1, \ldots, x_n$ has the form

$$ \begin{align*} a_{1,1} x_1 + a_{1,2} x_2 + \cdots + a_{1,n} x_n & = b_1 \\ a_{2,1} x_1 + a_{2,2} x_2 + \cdots + a_{2,n} x_n & = b_2 \\ \vdots & \\ a_{m,1} x_1 + a_{m,2} x_2 + \cdots + a_{m,n} x_n & = b_m. \end{align*} $$

The system of linear equations can be written in matrix form ${A\boldsymbol{x}=\boldsymbol{b}}$, where

$$ \begin{align*} A = \left( \begin{array}{cccc} a_{1,1} & a_{1,2} & \cdots & a_{1,n} \\ a_{2,1} & a_{2,2} & \cdots & a_{2,n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{m,1} & a_{m,2} & \cdots & a_{m,n} \end{array} \right), \quad \boldsymbol{x} = \left( \begin{array}{c} x_1 \\ x_2 \\ \vdots \\ x_n \end{array} \right), \quad \boldsymbol{b} = \left( \begin{array}{c} b_1 \\ b_2 \\ \vdots \\ b_m \end{array} \right), \end{align*} $$

so that $A \in \mathbb{R}^{m \times n}$, $\boldsymbol{x} \in \mathbb{R}^n$ and $\boldsymbol{b} \in \mathbb{R}^m$.
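
As a minimal numerical sketch (assuming NumPy is available; the $2 \times 2$ system below is a made-up example), the matrix form can be set up and solved directly:

```python
import numpy as np

# Hypothetical 2x2 example: x1 + 2*x2 = 5, 3*x1 + 4*x2 = 6
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])    # coefficient matrix, A in R^{2x2}
b = np.array([5.0, 6.0])      # right-hand side, b in R^2

x = np.linalg.solve(A, b)     # solve A x = b
print(x)                      # -> [-4.   4.5]
assert np.allclose(A @ x, b)  # the solution satisfies the system
```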

Direct Methods

Algorithm: Gaussian Elimination

Gaussian elimination is a method for solving systems of linear equations. For a square system ($m = n$), forward elimination (a sequence of row-wise operations) converts the matrix $A$ to upper triangular (row echelon) form, and back substitution then solves the resulting triangular system. The row operations are (see the short sketch after this list):

  • row swapping
  • row scaling, i.e. multiplying by a non-zero scalar
  • row addition, i.e. adding a multiple of one row to another
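
As a quick illustration of the three row operations (assuming NumPy; the $3 \times 3$ matrix is a made-up example):

```python
import numpy as np

# Hypothetical 3x3 matrix used only to demonstrate the row operations
A = np.array([[ 2.0,  1.0, -1.0],
              [-3.0, -1.0,  2.0],
              [-2.0,  1.0,  2.0]])

# Row swapping: exchange rows 0 and 2
A[[0, 2]] = A[[2, 0]]

# Row scaling: multiply row 1 by a non-zero scalar
A[1] *= -2.0

# Row addition: add a multiple of row 0 to row 2
A[2] += 0.5 * A[0]

print(A)
```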

Forward elimination is written as follows; the multiplier $m_{i,k}$ is computed before row $i$ is updated, because the entry $a_{i,k}$ it is built from is itself overwritten during the elimination:

1  For $k=1$ to $n-1$:
2  For $i = k + 1$ to $n$:
3  $m_{i,k} = \dfrac{a_{i,k}}{a_{k,k}}$
4  For $j = k$ to $n$:
5  $a_{i,j} = a_{i,j} - m_{i,k} \, a_{k,j}$
6  End
7  $b_{i} = b_{i} - m_{i,k} \, b_{k}$
8  End
9  End
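
A minimal sketch of the forward elimination loop in NumPy, assuming a square system stored as float arrays with non-zero pivots (the function name forward_eliminate and the in-place convention are illustrative, not prescribed by the algorithm above):

```python
import numpy as np

def forward_eliminate(A, b):
    """Reduce A to upper triangular form in place, applying the same
    row operations to b (assumes float arrays, non-zero pivots, no pivoting)."""
    n = len(b)
    for k in range(n - 1):            # elimination step k
        for i in range(k + 1, n):     # rows below the pivot row
            m = A[i, k] / A[k, k]     # multiplier m_{i,k}
            A[i, k:] -= m * A[k, k:]  # a_{i,j} <- a_{i,j} - m * a_{k,j}
            b[i] -= m * b[k]          # b_i <- b_i - m * b_k
    return A, b
```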

Then back substitute, starting with the last unknown:

10 Set $x_n = \dfrac{b_n}{a_{n,n}}$
11 For $i = n-1$ to $1$:
12 $y = b_i$
13 For $j = i+1$ to $n$:
14 $y = y - a_{i,j} x_j$
15 End
16 $x_i = \dfrac{y}{a_{i,i}}$
17 End
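
Continuing the sketch, back substitution and a small made-up usage example (reusing the hypothetical forward_eliminate from above):

```python
import numpy as np

def back_substitute(A, b):
    """Solve the upper triangular system A x = b by back substitution."""
    n = len(b)
    x = np.zeros(n)
    x[n - 1] = b[n - 1] / A[n - 1, n - 1]
    for i in range(n - 2, -1, -1):            # i = n-2, ..., 0
        y = b[i] - A[i, i + 1:] @ x[i + 1:]   # subtract the known terms
        x[i] = y / A[i, i]
    return x

# Usage on a small hypothetical system
A = np.array([[ 2.0,  1.0, -1.0],
              [-3.0, -1.0,  2.0],
              [-2.0,  1.0,  2.0]])
b = np.array([8.0, -11.0, -3.0])
U, c = forward_eliminate(A.copy(), b.copy())
x = back_substitute(U, c)
print(x)   # approximately [ 2.  3. -1.]
```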

Algorithm: Gaussian Elimination with Scaled Partial Pivoting

A pivot element is the entry of the matrix $A$ that is used, at a given step of forward elimination, to eliminate the entries below it in its column. Choosing the pivot carefully (pivoting) helps reduce errors due to rounding; in scaled partial pivoting, each candidate pivot is judged relative to the largest entry in its own row.

1  Find maximal absolute values vector $\boldsymbol{s}$ with entries $s_i= \max_{j=1, \ldots, n} \left| a_{i,j} \right|$
2  For $k=1$ to $n-1$:
3  For $i = k$ to $n$:
4  Compute $\left| a_{i,k} / s_i \right|$
5  End
6  Find the row with the largest relative pivot element and denote this as row $j$
7  Swap rows $k$ and $j$
8  Swap entries $k$ and $j$ in vector $\boldsymbol{s}$
9  Perform forward elimination step $k$, i.e. eliminate the entries below the pivot $a_{k,k}$ in column $k$
10 End

The matrix will now be in row-echelon form, so that back substitution can be performed.
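
A sketch of the whole procedure with scaled partial pivoting, assuming NumPy and a square, invertible system (the name solve_spp is illustrative; rows are swapped explicitly rather than tracked through an index vector):

```python
import numpy as np

def solve_spp(A, b):
    """Solve A x = b by Gaussian elimination with scaled partial pivoting."""
    A, b = A.astype(float).copy(), b.astype(float).copy()
    n = len(b)
    s = np.max(np.abs(A), axis=1)              # scale factors s_i = max_j |a_{i,j}|
    for k in range(n - 1):
        # choose the row i >= k with the largest relative pivot |a_{i,k}| / s_i
        j = k + np.argmax(np.abs(A[k:, k]) / s[k:])
        A[[k, j]] = A[[j, k]]                  # swap rows k and j
        b[[k, j]] = b[[j, k]]
        s[[k, j]] = s[[j, k]]                  # swap the scale factors too
        for i in range(k + 1, n):              # eliminate below the pivot
            m = A[i, k] / A[k, k]
            A[i, k:] -= m * A[k, k:]
            b[i] -= m * b[k]
    # back substitution on the upper triangular system
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x
```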

Theorem: $LU$-Decomposition

Let ${A \in \mathbb{R}^{n \times n}}$ be invertible and suppose forward elimination can be carried out without row interchanges (i.e. every pivot is non-zero). Then $A$ admits a decomposition ${A=LU}$, where $L$ is a lower triangular matrix and $U$ is an upper triangular matrix, and

$$ L = U_1^{-1} U_2^{-1} \cdots U_{n-1}^{-1} $$

where $U_i$ is the elementary matrix describing the $i$th step of forward elimination. The upper triangular matrix $U$ is given by

$$ U = U_{n-1} \cdots U_2 U_1 A. $$
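
A minimal sketch of computing $L$ and $U$ by recording the elimination multipliers, under the same no-row-interchange assumption (the name lu_decompose and the test matrix are illustrative):

```python
import numpy as np

def lu_decompose(A):
    """LU factorization without pivoting (assumes non-zero pivots)."""
    n = A.shape[0]
    U = A.astype(float).copy()
    L = np.eye(n)
    for k in range(n - 1):
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]      # store the multiplier m_{i,k} in L
            U[i, k:] -= L[i, k] * U[k, k:]   # elimination step on U
    return L, U

A = np.array([[ 2.0,  1.0, -1.0],
              [-3.0, -1.0,  2.0],
              [-2.0,  1.0,  2.0]])
L, U = lu_decompose(A)
assert np.allclose(L @ U, A)   # A = L U
```

Since $L$ simply collects the multipliers $m_{i,k}$ below its unit diagonal, the factorization costs no more than forward elimination itself.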