Taylor Series

The Taylor series, or Taylor expansion, of a function is defined as follows.

Definition: Taylor Series

For a function $f : \mathbb{R} \to \mathbb{R}$ which is infinitely differentiable at a point $c$, the Taylor series of $f$ about $c$ is given by

$$ \begin{equation*} \sum\limits_{k=0}^{\infty} \dfrac{ f^{(k)} \left( c \right) }{k!} \left( x - c \right)^{k}. \end{equation*} $$

This is an infinite series in powers of $(x - c)$, which converges for values of $x$ satisfying $\left| x - c \right| < r$, where $r$ is the radius of convergence.
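As a concrete illustration, the following Python sketch evaluates the truncated Taylor series of $\sin(x)$ about a point $c$, using the fact that the derivatives of $\sin$ cycle with period four; the function name and the test point $x = 1.2$ are chosen here purely for illustration. Since $\sin$ is entire, the radius of convergence is infinite and the error shrinks rapidly as the truncation order grows.

```python
import math

def taylor_sin(x, c=0.0, n=10):
    """Evaluate the degree-n Taylor polynomial of sin about the point c."""
    # The k-th derivative of sin at c cycles through sin, cos, -sin, -cos.
    derivs = [math.sin(c), math.cos(c), -math.sin(c), -math.cos(c)]
    return sum(derivs[k % 4] / math.factorial(k) * (x - c) ** k
               for k in range(n + 1))

# Truncation error at x = 1.2 for increasing orders n.
for n in (1, 3, 5, 9):
    approx = taylor_sin(1.2, n=n)
    print(f"n={n}: approx={approx:.10f}, error={abs(approx - math.sin(1.2)):.2e}")
```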

Theorem: Taylor's Theorem

Let $f \in C^{n+1}\left([a, b]\right)$, i.e. let $f$ be $(n+1)$-times continuously differentiable on the interval $[a, b]$. Then for any points $c$ and $x$ in the interval, the function can be written as

$$ \begin{equation*} f\left( x \right) = \sum\limits_{k=0}^{n} \dfrac{f^{(k)} \left(c\right) }{k!} \left( x- c \right)^{k} + \dfrac{f^{(n+1)} \left( \xi \right) }{\left( n + 1 \right)!} \left( x - c \right)^{n+1} \end{equation*} $$

for some value $\xi$ between $c$ and $x$. The remainder term vanishes as $x$ approaches $c$, that is,

$$ \begin{equation*} \lim\limits_{x \rightarrow c} \dfrac{ f^{(n+1)} \left( \xi \right) }{ \left( n + 1 \right)!} \left( x - c \right)^{n+1} = 0. \end{equation*} $$
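Numerically, the Lagrange remainder above gives a computable error bound. A minimal Python sketch, taking $f = \exp$ about $c = 0$ (so every derivative at $0$ equals $1$ and $|f^{(n+1)}(\xi)| \le e^{x}$ for $\xi \in [0, x]$), checks that the actual truncation error sits below the bound; the choices $x = 0.9$ and $n = 4$ are arbitrary.

```python
import math

def exp_taylor(x, n):
    """Degree-n Taylor polynomial of exp about c = 0."""
    return sum(x ** k / math.factorial(k) for k in range(n + 1))

x, n = 0.9, 4
actual_error = abs(math.exp(x) - exp_taylor(x, n))
# Lagrange remainder bound: |R_n(x)| <= e^x * x^(n+1) / (n+1)!
bound = math.exp(x) * x ** (n + 1) / math.factorial(n + 1)
print(f"actual error = {actual_error:.3e}, Lagrange bound = {bound:.3e}")
assert actual_error <= bound
```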
Theorem: Taylor's Theorem for Multivariate Functions

For a function $f \, : \, \mathbb{R}^n \to \mathbb{R}$ which is sufficiently differentiable at a point $\boldsymbol{x}$, the Taylor expansion generalises to

$$ \begin{equation*} f\left( \boldsymbol{x} + \boldsymbol{a} \right) = f\left( \boldsymbol{x} \right) + \boldsymbol{a}\cdot J + \dfrac{1}{2} \boldsymbol{a}^T H \boldsymbol{a} + \ldots \end{equation*} $$

where $J$ is the Jacobian of $f$ (for a scalar-valued function, its gradient) and $H$ is the Hessian, both evaluated at $\boldsymbol{x}$.
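The second-order truncation of this expansion is easy to check numerically. In the sketch below, the test function $f(x, y) = e^{x} \sin y$, its hand-coded gradient and Hessian, and the particular expansion point and displacement are all choices made for illustration only.

```python
import numpy as np

def f(p):
    x, y = p
    return np.exp(x) * np.sin(y)

def gradient(p):  # Jacobian of the scalar function f
    x, y = p
    return np.array([np.exp(x) * np.sin(y), np.exp(x) * np.cos(y)])

def hessian(p):   # matrix of second partial derivatives of f
    x, y = p
    return np.array([[np.exp(x) * np.sin(y),  np.exp(x) * np.cos(y)],
                     [np.exp(x) * np.cos(y), -np.exp(x) * np.sin(y)]])

x0 = np.array([0.3, 0.7])    # expansion point
a = np.array([0.05, -0.02])  # small displacement

# f(x + a) ~ f(x) + a.J + (1/2) a^T H a, with J and H evaluated at x.
quadratic = f(x0) + gradient(x0) @ a + 0.5 * a @ hessian(x0) @ a
print(f"f(x+a) = {f(x0 + a):.8f}, 2nd-order Taylor = {quadratic:.8f}")
```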

Example:

A Jupyter notebook (.ipynb) with an example of the Taylor series for $\sin\left(x\right)$ can be accessed online [here].

It can be downloaded from [here] as a Python file or as a notebook from [here].

Theorem: Rolle's Theorem

If a real-valued function $f$ is continuous on a proper closed interval $[a, b]$, differentiable on the open interval $(a, b)$, and has ${f (a) = f (b)}$, then there exists at least one $c$ in the open interval $(a, b)$ such that

$$ f^\prime (c) = 0. $$
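Rolle's theorem is easy to see in action numerically. The sketch below takes $f(x) = x(1 - x)$ on $[0, 1]$, which satisfies $f(0) = f(1) = 0$, and locates the guaranteed point $c$ by bisecting the derivative $f^\prime(x) = 1 - 2x$; the function and the simple bisection helper are illustrative choices.

```python
def bisect_root(g, lo, hi, tol=1e-12):
    """Find a root of g in [lo, hi] by bisection, assuming g(lo)*g(hi) <= 0."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if g(lo) * g(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# f(x) = x(1 - x) has f(0) = f(1) = 0, so Rolle's theorem guarantees
# some c in (0, 1) with f'(c) = 0; here f'(x) = 1 - 2x.
c = bisect_root(lambda x: 1.0 - 2.0 * x, 0.0, 1.0)
print(f"c = {c:.6f}")  # -> 0.500000
```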
Theorem: Mean Value Theorem

The theorem states that if $f$ is a continuous function on the closed interval $[a, b]$ and differentiable on the open interval $(a, b)$, then there exists a point ${c \in (a, b)}$ such that the tangent at $c$ is parallel to the secant line through the endpoints ${\big(a, f(a) \big)}$ and ${\big(b, f(b) \big)}$, that is,

$$ f^\prime (c) = \dfrac{f(b) - f(a)}{b - a}. $$
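The same bisection idea illustrates the mean value theorem: for $f(x) = x^{3}$ on $[0, 2]$, the secant slope is $4$, and solving $f^\prime(c) = 3c^{2} = 4$ gives $c = 2/\sqrt{3} \approx 1.1547$. The function and interval below are illustrative choices; the bisection works because $f^\prime$ is increasing on this interval.

```python
f = lambda x: x ** 3
df = lambda x: 3 * x ** 2
a, b = 0.0, 2.0

secant_slope = (f(b) - f(a)) / (b - a)  # = 4

# Solve f'(c) = secant slope by bisection; f' is increasing on (0, 2).
lo, hi = a, b
while hi - lo > 1e-12:
    mid = 0.5 * (lo + hi)
    if df(mid) < secant_slope:
        lo = mid
    else:
        hi = mid
c = 0.5 * (lo + hi)
print(f"c = {c:.6f}, f'(c) = {df(c):.6f}, secant slope = {secant_slope:.6f}")
```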