In the preceding chapter (§ 125) we proved that if $$f(x)$$ has a derivative $$f'(x)$$ throughout the interval $${[a, b]}$$ then $f(b) - f(a) = (b - a) f'(\xi),$ where $$a < \xi < b$$; or that, if $$f(x)$$ has a derivative throughout $${[a, a + h]}$$, then $\begin{equation*} f(a + h) - f(a) = hf'(a + \theta_{1} h), \tag{1} \end{equation*}$ where $$0 < \theta_{1} < 1$$. This we proved by considering the function $f(b) - f(x) - \frac{b - x}{b - a} \{f(b) - f(a)\}$ which vanishes when $$x = a$$ and when $$x = b$$.

Let us now suppose that $$f(x)$$ has also a second derivative $$f''(x)$$ throughout $${[a, b]}$$, an assumption which of course involves the continuity of the first derivative $$f'(x)$$, and consider the function $f(b) - f(x) - (b - x) f'(x) - \left(\frac{b - x}{b - a}\right)^{2} \{f(b) - f(a) - (b - a)f'(a)\}.$ This function also vanishes when $$x = a$$ and when $$x = b$$; and its derivative is $\frac{2(b - x)}{(b - a)^{2}} \{f(b) - f(a) - (b - a) f'(a) - \tfrac{1}{2}(b - a)^{2}f''(x)\},$ and this must vanish (§ 121) for some value of $$x$$ between $$a$$ and $$b$$ (exclusive of $$a$$ and $$b$$). Hence there is a value $$\xi$$ of $$x$$, between $$a$$ and $$b$$, and therefore capable of representation in the form $$a + \theta_{2}(b - a)$$, where $$0 < \theta_{2} < 1$$, for which $f(b) = f(a) + (b - a)f'(a) + \tfrac{1}{2}(b - a)^{2}f''(\xi).$

If we put $$b = a + h$$ we obtain the equation $\begin{equation*} f(a + h) = f(a) + hf'(a) + \tfrac{1}{2}h^{2} f''(a + \theta_{2}h), \tag{2} \end{equation*}$ which is the standard form of what may be called the Mean Value Theorem of the second order.
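As a numerical illustration (not in the text; the choice of function is ours), equation (2) can be solved for $$\theta_{2}$$ explicitly when $$f(x) = e^{x}$$, $$a = 0$$, $$h = 1$$, and the value does lie between $$0$$ and $$1$$:

```python
import math

# For f(x) = e^x, a = 0, h = 1, every derivative of f is e^x, and (2) reads
#   e = 1 + 1 + (1/2) e^{theta_2},
# so theta_2 = log(2(e - 2)).  (Illustrative choice of f, a, h.)
theta_2 = math.log(2 * (math.e - 2))
assert 0 < theta_2 < 1

# Check that (2) is satisfied to rounding error.
lhs = math.e
rhs = 1 + 1 + 0.5 * math.exp(theta_2)
assert abs(lhs - rhs) < 1e-12
```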

The analogy suggested by (1) and (2) at once leads us to formulate the following theorem:

Taylor’s or the General Mean Value Theorem. If $$f(x)$$ is a function of $$x$$ which has derivatives of the first $$n$$ orders throughout the interval $${[a, b]}$$, then $\begin{gathered} f(b) = f(a) + (b - a)f'(a) + \frac{(b - a)^{2}}{2!} f''(a) + \dots\\ + \frac{(b - a)^{n-1}}{(n - 1)!} f^{(n-1)}(a) + \frac{(b - a)^{n}}{n!}f^{(n)}(\xi),\end{gathered}$ where $$a < \xi < b$$; and if $$b = a + h$$ then $\begin{gathered} f(a + h) = f(a) + hf'(a) + \tfrac{1}{2} h^{2}f''(a) + \dots\\ + \frac{h^{n-1}}{(n - 1)!} f^{(n-1)}(a) + \frac{h^{n}}{n!} f^{(n)}(a + \theta_{n}h),\end{gathered}$ where $$0 < \theta_{n} < 1$$.

The proof proceeds on precisely the same lines as were adopted before in the special cases in which $$n = 1$$ and $$n = 2$$. We consider the function $F_{n}(x) - \left(\frac{b - x}{b - a}\right)^{n} F_{n}(a),$ where $\begin{gathered} F_{n}(x) = f(b) - f(x) - (b - x)f'(x) - \frac{(b - x)^{2}}{2!} f''(x) - \dots\\ - \frac{(b - x)^{n-1}}{(n - 1)!} f^{(n-1)}(x).\end{gathered}$ This function vanishes for $$x = a$$ and $$x = b$$; its derivative is $\frac{n(b - x)^{n-1}}{(b - a)^{n}} \left\{F_{n}(a) - \frac{(b - a)^{n}}{n!} f^{(n)}(x)\right\};$ and there must be some value of $$x$$ between $$a$$ and $$b$$ for which the derivative vanishes. This leads at once to the desired result.
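The theorem can likewise be illustrated numerically (our own choice of function, not the book's): with $$f = e^{x}$$, $$a = 0$$, $$b = 1$$, $$n = 3$$, the value of $$\xi$$ can be solved for directly and checked to lie between $$a$$ and $$b$$:

```python
import math

# With f = exp, a = 0, b = 1, n = 3, Taylor's theorem gives
#   e = 1 + 1 + 1/2! + (1/3!) e^{xi},
# so e^{xi} = 6(e - 5/2) and xi = log(6e - 15).  (Illustrative choice.)
xi = math.log(6 * math.e - 15)
assert 0 < xi < 1

# Check the theorem's equation to rounding error.
assert abs(math.e - (1 + 1 + 0.5 + math.exp(xi) / 6)) < 1e-12
```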

In view of the great importance of this theorem we shall give at the end of this chapter another proof, not essentially distinct from that given above, but different in form and depending on the method of integration by parts.

Examples LV

1. Suppose that $$f(x)$$ is a polynomial of degree $$r$$. Then $$f^{(n)}(x)$$ is identically zero when $$n > r$$, and the theorem leads to the algebraical identity $f(a + h) = f(a) + hf'(a) + \frac{h^{2}}{2!} f''(a) + \dots + \frac{h^{r}}{r!} f^{(r)}(a).$
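For a concrete instance of this identity (our own choice of polynomial), take $$f(x) = x^{3}$$, so $$r = 3$$; the two sides then agree exactly for every $$a$$ and $$h$$:

```python
# f(x) = x^3, so f'(x) = 3x^2, f''(x) = 6x, f'''(x) = 6, and the identity
#   f(a+h) = f(a) + h f'(a) + h^2/2! f''(a) + h^3/3! f'''(a)
# should hold (up to rounding) for arbitrary a and h.
def f(x):
    return x ** 3

a, h = 2.0, 0.7
lhs = f(a + h)
rhs = f(a) + h * 3 * a**2 + (h**2 / 2) * 6 * a + (h**3 / 6) * 6
assert abs(lhs - rhs) < 1e-9
```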

2. By applying the theorem to $$f(x) = 1/x$$, and supposing $$x$$ and $$x + h$$ positive, obtain the result $\frac{1}{x + h} = \frac{1}{x} - \frac{h}{x^{2}} + \frac{h^{2}}{x^{3}} - \dots + \frac{(-1)^{n-1} h^{n-1}}{x^{n}} + \frac{(-1)^{n} h^{n}}{(x + \theta_{n} h)^{n+1}}.$

[Since $\frac{1}{x + h} = \frac{1}{x} - \frac{h}{x^{2}} + \frac{h^{2}}{x^{3}} - \dots + \frac{(-1)^{n-1} h^{n-1}}{x^{n}} + \frac{(-1)^{n} h^{n}}{x^{n}(x + h)},\quad$ we can verify the result by showing that $$x^{n}(x + h)$$ can be put in the form $$(x + \theta_{n}h)^{n+1}$$, or that $$x^{n+1} < x^{n}(x + h) < (x + h)^{n+1}$$, as is evidently the case.]
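Both the bracketed identity and the inequality locating $$x^{n}(x+h)$$ between $$x^{n+1}$$ and $$(x+h)^{n+1}$$ are easy to confirm numerically; a sketch with arbitrarily chosen positive $$x$$, $$h$$ and a particular $$n$$:

```python
# Check the bracketed identity of Ex. 2 and the stated inequality
# for one arbitrary positive choice of x and h, and one value of n.
x, h, n = 2.0, 0.5, 4
partial = sum((-1) ** k * h**k / x ** (k + 1) for k in range(n))
remainder = (-1) ** n * h**n / (x**n * (x + h))
assert abs(partial + remainder - 1 / (x + h)) < 1e-12
assert x ** (n + 1) < x**n * (x + h) < (x + h) ** (n + 1)
```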

3. Obtain the formula $\begin{gathered} \sin(x + h) = \sin x + h\cos x - \frac{h^{2}}{2!}\sin x - \frac{h^{3}}{3!}\cos x + \dots\\ + (-1)^{n-1}\frac{h^{2n-1}}{(2n - 1)!}\cos x + (-1)^{n} \frac{h^{2n}}{(2n)!}\sin(x + \theta_{2n} h),\end{gathered}$ the corresponding formula for $$\cos(x + h)$$, and similar formulae involving powers of $$h$$ extending up to $$h^{2n+1}$$.
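One way of testing this formula: since $$|\sin| \le 1$$, the remainder after the $$h^{2n-1}$$ term cannot exceed $$h^{2n}/(2n)!$$ in absolute value. A numerical check, with $$x$$, $$h$$, $$n$$ chosen by us:

```python
import math

# Partial sum of Ex. 3 up to the h^{2n-1} term: the k-th term is
# h^k/k! times sin x (k even) or cos x (k odd), with sign (-1)^(k//2).
x, h, n = 0.3, 0.2, 3
partial = sum(
    (-1) ** (k // 2) * h**k / math.factorial(k)
    * (math.sin(x) if k % 2 == 0 else math.cos(x))
    for k in range(2 * n)
)
remainder = math.sin(x + h) - partial
# The formula writes the remainder as (-1)^n h^{2n}/(2n)! sin(x + theta h),
# so its size is at most h^{2n}/(2n)!.
assert abs(remainder) <= h ** (2 * n) / math.factorial(2 * n)
```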

4. Show that if $$m$$ is a positive integer, and $$n$$ a positive integer not greater than $$m$$, then $(x + h)^{m} = x^{m} + \binom{m}{1}x^{m-1} h + \dots + \binom{m}{n - 1}x^{m-n+1} h^{n-1} + \binom{m}{n}(x + \theta_{n} h)^{m-n} h^{n}.$ Show also that, if the interval $${[x, x + h]}$$ does not include $$x = 0$$, the formula holds for all real values of $$m$$ and all positive integral values of $$n$$; and that, even if $$x < 0 < x + h$$ or $$x + h < 0 < x$$, the formula still holds if $$m - n$$ is positive.

5. The formula $$f(x + h) = f(x) + hf'(x + \theta_{1}h)$$ is not true if $$f(x) = 1/x$$ and $$x < 0 < x + h$$. [For $$f(x + h) - f(x) > 0$$ and $$hf'(x + \theta_{1} h) = -h/(x + \theta_{1} h)^{2} < 0$$; it is evident that the conditions for the truth of the Mean Value Theorem are not satisfied.]

6. If $$x = -a$$, $$h = 2a$$, $$f(x) = x^{1/3}$$, then the equation $f(x + h) = f(x) + hf'(x + \theta_{1} h)$ is satisfied by $$\theta_{1} = \frac{1}{2} \pm\frac{1}{18}\sqrt{3}$$. [This example shows that the result of the theorem may hold even if the conditions under which it was proved are not satisfied.]

7. Newton’s method of approximation to the roots of equations. Let $$\xi$$ be an approximation to a root of an algebraical equation $$f(x) = 0$$, the actual root being $$\xi + h$$. Then $0 = f(\xi + h) = f(\xi) + hf'(\xi) + \tfrac{1}{2} h^{2}f''(\xi + \theta_{2}h),$ so that $h = -\frac{f(\xi)}{f'(\xi)} - \tfrac{1}{2} h^{2} \frac{f''(\xi + \theta_{2}h)}{f'(\xi)}.$

It follows that in general a better approximation than $$x = \xi$$ is $x = \xi - \frac{f(\xi)}{f'(\xi)}.$ If the root is a simple root, so that $$f'(\xi + h) \neq 0$$, we can, when $$h$$ is small enough, find a positive constant $$K$$ such that $$|f'(x)| > K$$ for all the values of $$x$$ which we are considering, and then, if $$h$$ is regarded as of the first order of smallness, $$f(\xi)$$ is of the first order of smallness, and the error in taking $$\xi - \{f(\xi)/f'(\xi)\}$$ as the root is of the second order.
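The rule is easily mechanised; a minimal sketch (the function name and the example equation $$x^{3} = 2$$ are ours, not the text's):

```python
def newton(f, fprime, xi, steps):
    """Repeat the improvement x -> x - f(x)/f'(x) of Ex. 7."""
    for _ in range(steps):
        xi = xi - f(xi) / fprime(xi)
    return xi

# Root of x^3 - 2 = 0 near 1.3 (the cube root of 2).
root = newton(lambda x: x**3 - 2, lambda x: 3 * x**2, 1.3, 6)
assert abs(root**3 - 2) < 1e-12
```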

8. Apply this process to the equation $$x^{2} = 2$$, taking $$\xi = 3/2$$ as the first approximation. [We find $$h = -1/12$$, $$\xi + h = 17/12 = 1.417\dots$$, which is quite a good approximation, in spite of the roughness of the first. If now we repeat the process, taking $$\xi = 17/12$$, we obtain $$\xi + h = 577/408 = 1.414\ 215\dots$$, which is correct to $$5$$ places of decimals.]
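The arithmetic of this example can be reproduced exactly with rational numbers:

```python
from fractions import Fraction

def newton_step(xi):
    # One Newton step for f(x) = x^2 - 2: xi - (xi^2 - 2)/(2 xi).
    return xi - (xi * xi - 2) / (2 * xi)

# Starting from 3/2, the successive approximations of Ex. 8 appear exactly.
x1 = newton_step(Fraction(3, 2))
x2 = newton_step(x1)
assert x1 == Fraction(17, 12)
assert x2 == Fraction(577, 408)
```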

9. By considering in this way the equation $$x^{2} - 1 - y = 0$$, where $$y$$ is small, show that $$\sqrt{1 + y} = 1 + \frac{1}{2} y - \{\frac{1}{4}y^{2}/(2 + y)\}$$ approximately, the error being of the fourth order.
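The claimed order of the error can be checked numerically (values of $$y$$ ours): the error should be bounded by a multiple of $$y^{4}$$ for small $$y$$.

```python
import math

# The error of 1 + y/2 - y^2/(4(2+y)) as an approximation to sqrt(1+y)
# should shrink like y^4; here it is well below y^4 itself.
for y in (0.1, 0.01):
    approx = 1 + y / 2 - y * y / (4 * (2 + y))
    err = abs(math.sqrt(1 + y) - approx)
    assert err < y**4
```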

10. Show that the error in taking the root to be $$\xi - (f/f') - \frac{1}{2}(f^{2}f''/f'^{3})$$, where $$\xi$$ is the argument of every function, is in general of the third order.

11. The equation $$\sin x = \alpha x$$, where $$\alpha$$ is small, has a root nearly equal to $$\pi$$. Show that $$(1 - \alpha)\pi$$ is a better approximation, and $$(1 - \alpha + \alpha^{2})\pi$$ a better still. [The method of Exs. 7–10 does not depend on $$f(x) = 0$$ being an algebraical equation, so long as $$f'$$ and $$f''$$ are continuous.]

12. Show that the limit when $$h \to 0$$ of the number $$\theta_{n}$$ which occurs in the general Mean Value Theorem is $$1/(n + 1)$$, provided that $$f^{(n+1)}(x)$$ is continuous.

[For $$f(x + h)$$ is equal to each of $f(x) + \dots + \frac{h^{n}}{n!} f^{(n)}(x + \theta_{n}h),\quad f(x) + \dots + \frac{h^{n}}{n!} f^{(n)}(x) + \frac{h^{n+1}}{(n + 1)!} f^{(n+1)}(x + \theta_{n+1}h),$ where $$\theta_{n+1}$$ as well as $$\theta_{n}$$ lies between $$0$$ and $$1$$. Hence $f^{(n)}(x + \theta_{n}h) = f^{(n)}(x) + \frac{hf^{(n+1)}(x + \theta_{n+1}h)}{n + 1}.$ But if we apply the original Mean Value Theorem to the function $$f^{(n)}(x)$$, taking $$\theta_{n}h$$ in place of $$h$$, we find $f^{(n)}(x + \theta_{n}h) = f^{(n)}(x) + \theta_{n}hf^{(n+1)}(x + \theta\theta_{n}h),$ where $$\theta$$ also lies between $$0$$ and $$1$$. Hence $\theta_{n} f^{(n+1)}(x + \theta\theta_{n} h) = \frac{f^{(n+1)}(x + \theta_{n+1} h)}{n + 1},$ from which the result follows, since $$f^{(n+1)}(x + \theta\theta_{n} h)$$ and $$f^{(n+1)}(x + \theta_{n+1} h)$$ tend to the same limit $$f^{(n+1)}(x)$$ as $$h \to 0$$.]
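A numerical illustration of Ex. 12 (choice of function ours): with $$f = \exp$$ and $$n = 1$$, the equation $$e^{x+h} = e^{x} + h e^{x + \theta h}$$ can be solved for $$\theta$$ explicitly, and the value should tend to $$1/(n+1) = \frac{1}{2}$$:

```python
import math

# Solving e^{x+h} = e^x + h e^{x + theta h} for theta gives
#   theta = log((e^h - 1)/h) / h,
# independent of x; by Ex. 12 it should tend to 1/2 as h -> 0.
def theta(h):
    return math.log((math.exp(h) - 1) / h) / h

assert abs(theta(1e-2) - 0.5) < 1e-2
assert abs(theta(1e-4) - 0.5) < 1e-3
```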

13. Prove that $$\{f(x + 2h) - 2f(x + h) + f(x)\}/h^{2} \to f''(x)$$ as $$h \to 0$$, provided that $$f''(x)$$ is continuous. [Use equation (2) of § 147.]
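The limit of Ex. 13 can be watched numerically; a sketch with $$f = \sin$$ and a point of our own choosing, where $$f''(x) = -\sin x$$:

```python
import math

# The second-difference quotient {f(x+2h) - 2f(x+h) + f(x)}/h^2 should
# approach f''(x) = -sin x as h -> 0; its error here is of order h.
f = math.sin
x = 0.7
for h in (1e-2, 1e-3, 1e-4):
    quotient = (f(x + 2 * h) - 2 * f(x + h) + f(x)) / h**2
    assert abs(quotient - (-math.sin(x))) < 2 * h
```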

14. Show that, if $$f^{(n)}(x)$$ is continuous for $$x = 0$$, then $f(x) = a_{0} + a_{1}x + a_{2}x^{2} + \dots + (a_{n} + \epsilon_{x}) x^{n},$ where $$a_{r} = f^{(r)}(0)/r!$$ and $$\epsilon_{x} \to 0$$ as $$x \to 0$$.1

15. Show that if $a_{0} + a_{1}x + a_{2}x^{2} + \dots + (a_{n} + \epsilon_{x}) x^{n} = b_{0} + b_{1}x + b_{2}x^{2} + \dots + (b_{n} + \eta_{x}) x^{n},$ where $$\epsilon_{x}$$ and $$\eta_{x}$$ tend to zero as $$x \to 0$$, then $$a_{0} = b_{0}$$, $$a_{1} = b_{1}$$, …, $$a_{n} = b_{n}$$. [Making $$x \to 0$$ we see that $$a_{0} = b_{0}$$. Now divide by $$x$$ and afterwards make $$x \to 0$$. We thus obtain $$a_{1} = b_{1}$$; and this process may be repeated as often as is necessary. It follows that if $$f(x) = a_{0} + a_{1}x + a_{2}x^{2} + \dots + (a_{n} + \epsilon_{x}) x^{n}$$, and the first $$n$$ derivatives of $$f(x)$$ are continuous, then $$a_{r} = f^{(r)}(0)/r!$$.]

1. It is in fact sufficient to suppose that $$f^{(n)}(0)$$ exists. See R. H. Fowler, “The elementary differential geometry of plane curves” (Cambridge Tracts in Mathematics, No. 20, p. 104).↩︎