Calculus of a Single Variable
Note the following standard derivatives:
- $f\left(x\right) = c \quad \Rightarrow f^{\prime}\left( x \right) = 0$
- $f\left(x\right) = x^a \quad \Rightarrow f^{\prime}\left( x \right) = a x^{a-1}$
- $f\left(x\right) = a^x \quad \Rightarrow f^{\prime}\left( x \right) = a^x \ln a$
- $f\left(x\right) = \log_{b}x \quad \Rightarrow f^{\prime}\left( x \right) = \dfrac{1}{x \ln b}$
- $f\left(x\right) = \sin\left(x\right) \quad \Rightarrow f^{\prime}\left( x \right) = \cos\left(x\right)$
- $f\left(x\right) = \cos\left(x\right) \quad \Rightarrow f^{\prime}\left( x \right) = -\sin\left(x\right)$
Thus for $f(x)=a^x$, when $a=e$ we have $f(x)=e^x$ and $f^{\prime}\left( x \right) = f\left( x \right) = e^x$.
Similarly, for $f\left(x\right) = \log_{b}x$ with $b=e$, i.e. $f\left(x\right) = \log_e x = \ln x$, we get $f^{\prime}\left( x \right) = \dfrac{1}{x}$.
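These two special cases can be checked numerically with a central finite difference; the helper `derivative`, the step size `h`, and the test point are illustrative choices, not part of the notes:

```python
import math

def derivative(f, x, h=1e-6):
    """Central finite-difference approximation of f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

x = 1.7  # arbitrary test point
# d/dx e^x = e^x
assert abs(derivative(math.exp, x) - math.exp(x)) < 1e-6
# d/dx ln x = 1/x
assert abs(derivative(math.log, x) - 1 / x) < 1e-6
```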
The chain rule, $h^{\prime}(x) = f^{\prime}\left(g(x)\right) g^{\prime}(x)$ for $h(x) = f(g(x))$, can be understood as stating that if a function $f$ is written in terms of $g$, which itself depends on the variable $x$ (that is, both $f$ and $g$ are dependent variables), then $f$ depends on $x$ as well, via the intermediate variable $g$.
For example, take $h(x) = \sin\left( x^2 \right)$ and write it as $h(x) = f(g(x))$ with $f(u)=\sin(u)$ and $g(x)=x^2$; then $h^{\prime}(x) = f^{\prime}\left(g(x)\right) g^{\prime}(x) = 2x \cos\left( x^2\right)$.
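The chain-rule result for $\sin(x^2)$ can likewise be verified against a finite-difference estimate; the helper and the test point below are assumptions for illustration:

```python
import math

def derivative(f, x, h=1e-6):
    """Central finite-difference approximation of f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

h_func = lambda x: math.sin(x ** 2)
x = 0.9  # arbitrary test point
analytic = 2 * x * math.cos(x ** 2)  # chain-rule derivative
assert abs(derivative(h_func, x) - analytic) < 1e-5
```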
Critical points (points where $f^{\prime}(x)=0$ or $f^{\prime}(x)$ is undefined) are candidates for local maxima or minima of the function.
Global Maximum
If $f(x)$ is a continuous function on a closed, bounded interval, then it always attains a global maximum and a global minimum on that interval.
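One crude way to approximate these guaranteed extrema is to sample the function densely on the interval; the function name and sample count below are arbitrary choices for a sketch, not a rigorous method:

```python
import math

def extrema_on_interval(f, a, b, n=10001):
    """Approximate the global max and min of a continuous f on [a, b] by sampling n points."""
    xs = [a + (b - a) * i / (n - 1) for i in range(n)]
    ys = [f(x) for x in xs]
    return max(ys), min(ys)

# sin on [0, pi] attains its global max 1 (at pi/2) and min 0 (at the endpoints)
fmax, fmin = extrema_on_interval(math.sin, 0.0, math.pi)
assert abs(fmax - 1.0) < 1e-6 and abs(fmin) < 1e-6
```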
Linear Approximations
$$ y = f\left(x_0 \right) + f^{\prime} \left(x_0 \right) \left( x - x_0 \right). $$
This is of the form $y = mx + b$ where the gradient is $m=f^{\prime}\left(x_0 \right)$ and the intercept is given by $b= f\left(x_0 \right) - x_0 f^{\prime} \left(x_0 \right)$.
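A minimal sketch of the tangent-line formula, using $\sqrt{x}$ near $x_0 = 4$ as an assumed example (the helper name `linear_approx` is not from the notes):

```python
import math

def linear_approx(f, df, x0):
    """Return the tangent line y(x) = f(x0) + f'(x0) (x - x0)."""
    return lambda x: f(x0) + df(x0) * (x - x0)

# sqrt near x0 = 4: f(4) = 2, f'(4) = 1/4, so sqrt(4.1) ~ 2 + 0.25 * 0.1 = 2.025
approx = linear_approx(math.sqrt, lambda x: 0.5 / math.sqrt(x), 4.0)
assert abs(approx(4.1) - math.sqrt(4.1)) < 1e-3
```

The approximation is good only close to $x_0$; the error grows with $\left(x - x_0\right)^2$, as the second-order Taylor term below makes explicit.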
Second Derivatives
Assuming $x_0$ is a critical point of $f(x)$ then $f^{\prime}\left( x_0 \right)=0$, so the Taylor expansion about $x_0$ is
$$ f\left(x_0 + x \right) \approx f\left(x_0\right) + \dfrac{1}{2!} f^{\prime\prime}\left(x_0 \right) x^2 + \ldots $$
Then
- If $f^{\prime\prime}\left(x_0 \right) < 0$, then $x_0$ is a local maximum
- If $f^{\prime\prime}\left(x_0 \right) > 0$, then $x_0$ is a local minimum
- If $f^{\prime\prime}\left(x_0 \right) = 0$, then the test is inconclusive and higher-order terms must be examined.
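The second-derivative test can be sketched numerically; the helper names, the finite-difference step, and the tolerance `1e-6` are all illustrative assumptions:

```python
import math

def second_derivative(f, x, h=1e-4):
    """Central finite-difference approximation of f''(x)."""
    return (f(x + h) - 2 * f(x) + f(x - h)) / (h * h)

def classify_critical_point(f, x0):
    """Apply the second-derivative test at a critical point x0."""
    d2 = second_derivative(f, x0)
    if d2 < -1e-6:
        return "local maximum"
    if d2 > 1e-6:
        return "local minimum"
    return "test inconclusive"

# cos has a critical point at x = 0 with cos''(0) = -1
print(classify_critical_point(math.cos, 0.0))  # local maximum
```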