## Differentiation — continued

### Differentiation of the Sum, Difference, Product and Quotient of Two Functions

Theorem:

If $f, g: I \rightarrow \Re$ are differentiable at $x_{0} \in I$, where I is an open interval in $\Re$, then so are $f \pm g$ and $fg$ at $x_{0}$. Furthermore, $\frac{f}{g}$ is differentiable at $x_{0}$ if $g(x_{0}) \neq 0$. We have

(a) derivative of $f \pm g$ at $x_{0}$ is $f^{'}(x_{0}) \pm g^{'}(x_{0})$

(b) derivative of $fg$ at $x_{0}$ is $f^{'}(x_{0})g(x_{0})+f(x_{0})g^{'}(x_{0})$

(c) derivative of $\frac{f}{g}$ at $x_{0}$ is $\frac{f^{'}(x_{0})g(x_{0})-f(x_{0})g^{'}(x_{0})}{(g(x_{0}))^{2}}$ if $g(x_{0}) \neq 0$

The proofs are straightforward and therefore omitted.
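Although the proofs are omitted, the three formulas are easy to check numerically. The sketch below compares each rule against a central-difference approximation of the derivative; the choices $f = \sin$, $g = \exp$ and the point $x_{0} = 0.7$ are illustrative assumptions, not from the text.

```python
# Numerical check of the sum, product and quotient rules at a point,
# using a central difference to approximate each derivative.
import math

def deriv(F, x, h=1e-6):
    """Central-difference approximation to F'(x)."""
    return (F(x + h) - F(x - h)) / (2 * h)

f, g = math.sin, math.exp
fp, gp = math.cos, math.exp   # f'(x) = cos x, g'(x) = e^x
x0 = 0.7

# (a) (f + g)' = f' + g'
lhs_sum = deriv(lambda x: f(x) + g(x), x0)
rhs_sum = fp(x0) + gp(x0)

# (b) (fg)' = f'g + fg'
lhs_prod = deriv(lambda x: f(x) * g(x), x0)
rhs_prod = fp(x0) * g(x0) + f(x0) * gp(x0)

# (c) (f/g)' = (f'g - fg') / g^2, valid here since g(x0) = e^{0.7} != 0
lhs_quot = deriv(lambda x: f(x) / g(x), x0)
rhs_quot = (fp(x0) * g(x0) - f(x0) * gp(x0)) / g(x0) ** 2

print(abs(lhs_sum - rhs_sum), abs(lhs_prod - rhs_prod), abs(lhs_quot - rhs_quot))
```

All three differences come out negligibly small, as the theorem predicts.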

We also have

Theorem (Chain Rule):

Let I and J be two intervals in $\Re$ and $f: I \rightarrow J$, and $g: J \rightarrow \Re$ be differentiable at $x_{0} \in I$ and $f(x_{0}) \in J$ respectively. Then, $h \equiv g \circ f : I \rightarrow \Re$ is differentiable at $x_{0}$ and

$h^{'}(x_{0})=g^{'}(f(x_{0}))f^{'}(x_{0})$

Note that $h = g \circ f$ is defined as $h(x)=g(f(x))$.

Proof.

Let us write $y=f(x)$ so that by the continuity of f at $x_{0}$, we have that as $x \rightarrow x_{0}$, $y \rightarrow y_{0}=f(x_{0})$. Since g is differentiable at $y_{0}$, we have

$g(y)-g(y_{0})=(g^{'}(y_{0})+r_{1}(y,y_{0}))(y-y_{0})$

Here, $r_{1}(y,y_{0}) \rightarrow 0$ as $y \rightarrow y_{0}$. Again, since f is differentiable at $x_{0}$, we have

$f(x)-f(x_{0})=(f^{'}(x_{0})+r_{2}(x,x_{0}))(x-x_{0})$.

Here, $r_{2}(x,x_{0}) \rightarrow 0$ as $x \rightarrow x_{0}$. Thus, we have

$h(x)-h(x_{0})=g(f(x))-g(f(x_{0})) = (g^{'}(f(x_{0}))+r_{1}(y,y_{0}))(f(x)-f(x_{0}))$, which equals

$(g^{'}(y_{0})+r_{1}(y,y_{0}))(f^{'}(x_{0})+r_{2}(x,x_{0}))(x-x_{0})$, which in turn, equals

$g^{'}(f(x_{0}))f^{'}(x_{0})(x-x_{0})+(x-x_{0})r_{3}$ where

$r_{3}=g^{'}(f(x_{0}))r_{2}(x,x_{0})+f^{'}(x_{0})r_{1}(y,y_{0})+r_{1}(y,y_{0})r_{2}(x,x_{0})$.

Clearly, $r_{3} \rightarrow 0$ as $x \rightarrow x_{0}$ and hence,

$h^{'}(x_{0})=g^{'}(f(x_{0}))f^{'}(x_{0})$.

This completes the proof of the Chain Rule.
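The conclusion of the theorem can also be verified numerically. In the sketch below, the illustrative choices $f(x) = x^{2}$ and $g(y) = \sin y$ (assumptions, not from the text) give $h(x) = \sin(x^{2})$, and the chain-rule value $g^{'}(f(x_{0}))f^{'}(x_{0})$ is compared against a direct finite-difference estimate of $h^{'}(x_{0})$.

```python
# Numerical check of the chain rule h'(x0) = g'(f(x0)) f'(x0)
# for f(x) = x^2, g(y) = sin y, so h(x) = sin(x^2).
import math

def deriv(F, x, h=1e-6):
    """Central-difference approximation to F'(x)."""
    return (F(x + h) - F(x - h)) / (2 * h)

f = lambda x: x * x
g = math.sin
x0 = 1.3

lhs = deriv(lambda x: g(f(x)), x0)   # h'(x0) computed directly
rhs = math.cos(f(x0)) * (2 * x0)     # g'(f(x0)) * f'(x0)
print(abs(lhs - rhs))
```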

### Differential Notation of Leibniz

For a differentiable function f, if we write $y=f(x)$ and $y+\triangle y=f(x+\triangle x)$, then we get

$\lim_{\triangle x \rightarrow 0} \frac{\triangle y}{\triangle x}=\lim_{\triangle x \rightarrow 0}\frac{f(x+\triangle x)-f(x)}{\triangle x}=f^{'}(x)$

The expression

$\lim_{\triangle x \rightarrow 0}\frac{\triangle y}{\triangle x}$ is often written as $\frac{dy}{dx}$.

It is NOT true that $\frac{dy}{dx}$ is the quotient of the limits of $\triangle y$ and $\triangle x$, because both of them tend to zero. Rather, $\frac{d}{dx}$ should be thought of as an operator (the operation of differentiation) acting on the variable y, so that we have

$\frac{d}{dx}(y)=\frac{dy}{dx}$.

The operator $\frac{d}{dx}$ has the property $\frac{d}{dx}(y+u)=\frac{dy}{dx}+\frac{du}{dx}$ and

$\frac{d}{dx}(cy)=c\frac{dy}{dx}$. For two differentiable functions f and g, with the domain of g containing the range of f, if we write $y=f(x)$ and $u=g(y)$, so that $u=g(f(x))$, then the chain rule gives $\frac{du}{dx}=\frac{du}{dy} \cdot \frac{dy}{dx}$. When $y=f(x)$, so that $\frac{dy}{dx}=f^{'}(x)$, we write, following the German mathematician Leibniz,

$dy=f^{'}(x)dx$,

and dy and dx are called the differentials of y and x respectively.

Remark:

Let $f: \Re^{2} \rightarrow \Re$ be a function. We say that f is differentiable at $(x_{0},y_{0}) \in \Re^{2}$ if we can find numbers A, B depending on $(x_{0},y_{0})$ only, so that

$f(x,y) - f(x_{0},y_{0})=A(x-x_{0})+B(y-y_{0})+R(x,y,x_{0},y_{0})$ such that

$\lim \frac{R(x,y,x_{0},y_{0})}{\sqrt{(x-x_{0})^{2}+(y-y_{0})^{2}}}=0$ as $(x,y) \rightarrow (x_{0},y_{0})$

Observe that

$A=\lim_{x \rightarrow x_{0}}\frac{f(x,y_{0})-f(x_{0},y_{0})}{x-x_{0}}$ and $B=\lim_{y \rightarrow y_{0}}\frac{f(x_{0},y)-f(x_{0},y_{0})}{y-y_{0}}$

We call A and B the partial derivatives of f with respect to x and y respectively at $(x_{0},y_{0})$, and we write

$A=\frac{\partial f}{\partial x}(x_{0},y_{0})$, $B=\frac{\partial f}{\partial y}(x_{0},y_{0})$

Sometimes, we also write $\frac{\partial f}{\partial x}(x_{0},y_{0}) = f_{x}(x_{0},y_{0})$ and $\frac{\partial f}{\partial y}(x_{0},y_{0})=f_{y}(x_{0},y_{0})$. Again, as before, $\frac{\partial}{\partial x}$ and $\frac{\partial}{\partial y}$ may be thought of as operators which, operating on a function, give its partial derivatives with respect to x and y respectively.
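Since A and B are ordinary one-variable limits, they can be approximated by one-variable difference quotients. The sketch below does this for the illustrative choice $f(x,y) = x^{2}y + \sin y$ at $(x_{0},y_{0}) = (1, 2)$ (an assumption, not from the text), where the exact values are $f_{x} = 2xy$ and $f_{y} = x^{2} + \cos y$.

```python
# Partial derivatives f_x and f_y at (x0, y0), computed as one-variable
# central differences: hold y fixed for f_x, hold x fixed for f_y.
import math

def f(x, y):
    return x * x * y + math.sin(y)

x0, y0, h = 1.0, 2.0, 1e-6

A = (f(x0 + h, y0) - f(x0 - h, y0)) / (2 * h)   # f_x(x0, y0), exact: 2*x0*y0 = 4
B = (f(x0, y0 + h) - f(x0, y0 - h)) / (2 * h)   # f_y(x0, y0), exact: 1 + cos 2

print(A, B)
```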

Suppose that $\phi: \Re^{2} \rightarrow \Re$ is differentiable, and that $x, y: \Re \rightarrow \Re$ are differentiable functions. Furthermore, let $u: \Re \rightarrow \Re$ be defined by

$u(t)=\phi (x(t),y(t))$

It is not difficult to show that (Exercise!)

$u^{'}(t)=\frac{\partial \phi}{\partial x}(x(t),y(t))\, x^{'}(t)+\frac{\partial \phi}{\partial y}(x(t),y(t))\, y^{'}(t)$.

In Leibniz’s notation, this reads $du=\frac{\partial u}{\partial x}dx + \frac{\partial u}{\partial y} dy$.
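The exercise above can be checked numerically. The sketch takes the illustrative choices $\phi(x,y) = xy$, $x(t) = \cos t$, $y(t) = \sin t$ (assumptions, not from the text) and compares a direct finite-difference estimate of $u^{'}(t)$ with $\phi_{x}x^{'}(t) + \phi_{y}y^{'}(t)$.

```python
# Check of u'(t) = phi_x x'(t) + phi_y y'(t) for phi(x,y) = x*y,
# x(t) = cos t, y(t) = sin t, so u(t) = cos t * sin t.
import math

def u(t):
    return math.cos(t) * math.sin(t)

t0, h = 0.5, 1e-6
lhs = (u(t0 + h) - u(t0 - h)) / (2 * h)   # u'(t0), numerically

x, y = math.cos(t0), math.sin(t0)
# phi_x = y, phi_y = x; x'(t) = -sin t, y'(t) = cos t
rhs = y * (-math.sin(t0)) + x * math.cos(t0)
print(abs(lhs - rhs))
```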

As an application of this idea, consider the notion of an equipotential surface in electrostatics. An electrically charged (infinite) cylindrical conductor is one such, and because of the symmetry it is enough to look at a horizontal section of the cylinder, which is a closed curve; that is, the problem reduces to a 2-dimensional one. This equipotential curve is given by the equation $\phi (x,y)=constant$, say c, where $\phi$ is the real valued potential function. What is the electric field outside the curve? Using Leibniz’s notation as described above, we can write $d\phi=\frac{\partial \phi}{\partial x}dx + \frac{\partial \phi}{\partial y}dy$.

Since $\phi$ is constant along the curve, $d\phi = 0$; parametrizing the curve as $(x(t),y(t))$, the relation above reads

$\frac{\partial \phi}{\partial x} x^{'}(t) + \frac{\partial \phi}{\partial y} y^{'}(t)=0$,

that is, the vector $(\frac{\partial \phi}{\partial x}, \frac{\partial \phi}{\partial y})$ is orthogonal to the tangent vector $(x^{'}(t),y^{'}(t))$.

Recalling that the electric field E at a point $(x,y)$ outside has components $E_{1}=-\frac{\partial \phi}{\partial x}$ and $E_{2}=-\frac{\partial \phi}{\partial y}$, we conclude that the electric field E is along the outward normal. For example, for a right circular cylinder, the electric field is along the radial direction of every section.
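The circular-cylinder case can be illustrated numerically. The potential below, $\phi = -\ln r$ with $r = \sqrt{x^{2}+y^{2}}$, is an illustrative choice (the standard form for a line charge, up to physical constants; an assumption, not from the text). The sketch verifies that $E = (-\phi_{x}, -\phi_{y})$ is orthogonal to the tangent of the circle $r = 2$, i.e. that the field is radial.

```python
# For phi depending only on r, check that E = (-phi_x, -phi_y) is
# perpendicular to the tangent of the equipotential circle r = 2.
import math

def phi(x, y):
    return -math.log(math.hypot(x, y))   # illustrative line-charge potential

h = 1e-6
t0 = 0.8                                  # parameter value on the circle
x0, y0 = 2 * math.cos(t0), 2 * math.sin(t0)

E1 = -(phi(x0 + h, y0) - phi(x0 - h, y0)) / (2 * h)   # -phi_x
E2 = -(phi(x0, y0 + h) - phi(x0, y0 - h)) / (2 * h)   # -phi_y

tangent = (-2 * math.sin(t0), 2 * math.cos(t0))       # (x'(t), y'(t))
dot = E1 * tangent[0] + E2 * tangent[1]
print(dot)   # ~ 0: the field is along the outward normal
```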

We have seen that if f is differentiable at $x_{0}$, then $f(x)-f(x_{0})-f^{'}(x_{0})(x-x_{0})$ is of smaller order than $(x-x_{0})$ as $x \rightarrow x_{0}$.

Writing $\triangle y=f(x)-f(x_{0})$ and $\triangle x = x-x_{0}$, we get

$\triangle y = f^{'}(x_{0}) \triangle x + r\,\triangle x$,

where $r \rightarrow 0$ as $\triangle x \rightarrow 0$, so the error $r\,\triangle x$ is of smaller order than $\triangle x$. In other words, we are claiming that the increment in y is proportional to the increment in x when the latter is *small*; this is the principle of proportional parts. We have put the word small within asterisks as it is a relative term. Let us consider an example.

Let $f(x) = \sqrt{x}$ for $x \geq 0$. Then $f^{'}(x)=\frac{1}{2\sqrt{x}}$ for $x > 0$. So $f(16)=4$ and, since $\triangle x = 1$, $f(17)=f(16)+f^{'}(16)\cdot 1+r\cdot 1$. This gives us $\sqrt{17}=4+\frac {1}{8} + r = 4.125+r$, whereas computation on a hand calculator gives $\sqrt{17}=4.1231 \ldots$ This shows that $|r|<0.002$, and thus the differential approximates the increment in $f(x)$ correctly to at least the second decimal place.
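The $\sqrt{17}$ estimate above can be reproduced in a couple of lines, confirming the stated bound on the remainder r.

```python
# The sqrt(17) example from the text: the differential f'(16) * dx
# approximates the true increment to better than 0.002.
import math

approx = 4 + 1 / (2 * math.sqrt(16))   # f(16) + f'(16) * 1 = 4.125
true = math.sqrt(17)                   # 4.1231...
r = true - approx                      # remainder; |r| < 0.002
print(approx, true, r)
```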

More later,

Nalin Pithwa
