Chapter 6 Successive Approximations
In this chapter, we use successive approximations to demonstrate the existence of a unique solution to the initial value problem \(y'=y,\; y(0)=1\text{,}\) which you will recall from calculus is the function \(E(x) = e^x\text{.}\) There are many ways to define the “exponential function.” Here are a few.
Define sequences and convergence, then show that the sequence \(a_n = (1 + \frac{1}{n})^n\text{,}\) \(n=1,2,\dots\text{,}\) is increasing and bounded above. Then apply the Completeness Axiom to conclude that it converges to some number. Call that number \(e\text{.}\) Define general exponential functions of the form \(f(x) = b^x\text{.}\) When \(b=e\text{,}\) you have the natural exponential function.
Develop differential and integral calculus and then define the integral \(\displaystyle L(x) = \int_1^x \frac{1}{t} \; dt\text{.}\) Show that this function is strictly increasing, hence one-to-one, and then define a function \(E\text{,}\) the natural exponential function, to be the inverse of \(L\text{.}\)
Develop sequences, series, and convergence and show that for each real number, \(x,\) the series \(\displaystyle \sum_{i=0}^\infty \frac{x^i}{i!}\) converges. Now define \(\displaystyle E(x) = \sum_{i=0}^\infty \frac{x^i}{i!}\text{.}\)
Develop differential and integral calculus and then consider the question, does there exist a function \(f\) that satisfies:
\(f(0)=1\) and
\(f'(t) = f(t)\) for all \(t \in \mathbb{R}\text{?}\)
All of these approaches lead to the functions \(E(x) = e^x\) and \(L(x) = \ln(x)\) that you are familiar with. It is the last of these paths that we take, because it makes use of much of the analysis that you have already developed and serves as a brief introduction to series.
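Before we begin, here is a minimal numerical illustration, not part of the development that follows and assuming only the Python standard library, that the first and third definitions above point at the same number.

```python
# A quick numerical sanity check: the limit (1 + 1/n)^n and a partial sum
# of sum 1/k! both appear to approach the same number, the one calculus
# names e = 2.71828...
import math

n = 1_000_000
limit_approx = (1 + 1 / n) ** n                                  # first definition, large n
series_approx = sum(1 / math.factorial(k) for k in range(20))    # third definition, partial sum

print(limit_approx)    # ~2.7182805
print(series_approx)   # ~2.718281828459045
print(math.e)          # 2.718281828459045
```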
Problem 6.1 is a “warm-up” for the next sequence of problems. For this problem, assume that you do know that the function \(E(x) = e^x\) exists, that you remember all your calculus(!), and that the usual rules of differentiation and integration apply. For this problem only, if you need a reminder about Taylor series, you may consult the web or a book.
Problem 6.1
Successive approximations, Picard's iterates.
Compute the Taylor series for \(E(x) = e^x\text{.}\)
Show that if \(y\) is differentiable on \([0,1]\) and \(y'(t) =y(t)\) for all \(t \in [0,1]\) and \(y(0)=1\) then \(\displaystyle y(t) = 1 + \int_0^t y\text{.}\)
Show that if \(y\) is differentiable on \([0,1]\) and \(\displaystyle y(t) = 1 + \int_0^t y\) then \(y'=y\) and \(y(0)=1\text{.}\)
Let \(y_0 = 1\) and for each \(n= 1,2,\dots\) let \(\displaystyle y_n(t) = y_0 + \int_0^t y_{n-1}\) for all \(t \in [0,1]\text{.}\) Compute by hand \(y_1,y_2,y_3,\dots\text{.}\)
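If you want to check your hand computations in the last part, here is a minimal sketch, assuming the sympy library is available, that simply repeats the recursion \(\displaystyle y_n(t) = y_0 + \int_0^t y_{n-1}\) symbolically; treat it as a convenience for checking, not a substitute for the computation.

```python
# Minimal sketch (assumes sympy): iterate y_n(t) = 1 + integral_0^t y_{n-1}
# symbolically, starting from y_0 = 1, and print the first few iterates.
import sympy as sp

t, s = sp.symbols("t s")
y = sp.Integer(1)                                  # y_0 = 1
for n in range(1, 4):
    y = 1 + sp.integrate(y.subs(t, s), (s, 0, t))  # y_n(t) = 1 + integral_0^t y_{n-1}(s) ds
    print(f"y_{n}(t) =", sp.expand(y))
# y_1(t) = t + 1
# y_2(t) = t**2/2 + t + 1
# y_3(t) = t**3/6 + t**2/2 + t + 1
```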
Now that you've completed the “warm-up” exercise, forget that you know that there exists a function \(E(x) = e^x\) and close your calculus book or website.
Theorem 6.2
Let \(f_1, f_2, \dots\) denote the sequence of functions \(y_1, y_2, \dots\) defined in the previous problem, and show that if \(x \in [0,1]\) then \(f_1(x), f_2(x), f_3(x), \dots\) converges to some number.
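One possible route, assuming you have verified (say, by induction from the recursion in Problem 6.1) that \(\displaystyle f_n(x) = \sum_{k=0}^{n} \frac{x^k}{k!}\text{:}\) for a fixed \(x \in [0,1]\) we then have \(\displaystyle f_{n+1}(x) - f_n(x) = \frac{x^{n+1}}{(n+1)!} \ge 0\text{,}\) so the sequence \(f_1(x), f_2(x), f_3(x), \dots\) is non-decreasing; and since \(x \le 1\) and \(k! \ge 2^{k-1}\) for \(k \ge 1\text{,}\) it is bounded above by \(\displaystyle 1 + \sum_{k=1}^{n} \frac{1}{2^{k-1}} < 3\text{.}\) Your earlier results on bounded monotone sequences should finish the argument.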
Since for each \(x \in [0,1]\) the sequence \(f_1(x), f_2(x), f_3(x), \dots\) converges, we may define a function \(E\) on \([0,1]\) as follows: for each \(x \in [0,1]\text{,}\) let \(E(x)\) be the number to which \(f_1(x), f_2(x), f_3(x), \dots\) converges. Now we have that the sequence \(f_1, f_2, f_3, \dots\) converges pointwise to the function \(E\) on \([0,1]\text{.}\)
Theorem 6.3
Show that the sequence of functions \(f_1, f_2, f_3, \dots\) just defined converges uniformly to the function \(E\) on \([0,1]\text{.}\)
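One possible estimate, again assuming the closed form \(\displaystyle f_n(x) = \sum_{k=0}^{n} \frac{x^k}{k!}\) from the previous hint: for every \(x \in [0,1]\text{,}\) \(\displaystyle 0 \le E(x) - f_n(x) \le \sum_{k=n+1}^{\infty} \frac{1}{k!} \le \frac{2}{(n+1)!}\) (compare each term after the first with a power of \(\tfrac{1}{2}\)), and the right-hand side does not depend on \(x\text{,}\) which is exactly what uniform convergence asks for.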
Theorem 6.4
If \(f_1, f_2, \dots\) converges uniformly to \(h\) on \([0,1]\) and \(\displaystyle s \in [0,1]\) and \(\displaystyle \int_0^s f_n\) exists for all \(n=1,2, \dots\) then the sequence of numbers \(\displaystyle \int_0^s f_1, \int_0^s f_2, \dots\) converges to the number \(\displaystyle \int_0^s h\text{.}\)
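The heart of the argument is a single estimate. Granting that \(\displaystyle \int_0^s h\) and \(\displaystyle \int_0^s |f_n - h|\) exist (part of what there is to check), \(\displaystyle \left| \int_0^s f_n - \int_0^s h \right| \le \int_0^s |f_n - h| \le s \cdot \sup_{x \in [0,1]} |f_n(x) - h(x)| \le \sup_{x \in [0,1]} |f_n(x) - h(x)|\text{,}\) and uniform convergence says the right-hand side can be made as small as you like by taking \(n\) large.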
Theorem 6.5
Let \(E\) be the function defined immediately after Theorems 5.8 and 6.2. Let \(g\) be the function defined by \(\displaystyle g(s) = 1 + \int_0^s E\text{.}\) Show that \(E=g\) on \([0,1]\) and \(E(0)=1\text{.}\)
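One way to see why this should hold, combining Theorem 6.3, Theorem 6.4, and the recursion from Problem 6.1 (written with \(f\) in place of \(y\)): \(\displaystyle g(s) = 1 + \int_0^s E = 1 + \lim_{n \to \infty} \int_0^s f_n = \lim_{n \to \infty}\left( 1 + \int_0^s f_n \right) = \lim_{n \to \infty} f_{n+1}(s) = E(s)\text{,}\) while \(E(0)=1\) because \(f_n(0)=1\) for every \(n\text{.}\) Treat this as a sketch to be justified, not a finished proof.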
Theorem 6.6
Show that if \(L\) is the function whose domain is the set of all differentiable functions, defined by \(L(u) = u'-u\text{,}\) then \(y=0\) is the unique solution to \(L(y)=0\) and \(y(0)=0\text{.}\)
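A sketch of one standard route, phrased on \([0,1]\text{:}\) if \(L(y)=0\) and \(y(0)=0\text{,}\) then arguing as in Problem 6.1, \(\displaystyle y(t) = \int_0^t y\text{.}\) If \(M\) bounds \(|y|\) on \([0,1]\) (why is there such an \(M\text{?}\)), repeatedly substituting the equation into itself gives \(\displaystyle |y(t)| \le Mt,\; |y(t)| \le \frac{Mt^2}{2},\; \dots,\; |y(t)| \le \frac{Mt^n}{n!}\) for every \(n\text{,}\) and the right-hand side tends to \(0\text{,}\) so \(y=0\) on \([0,1]\text{.}\)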
Theorem 6.7
Suppose \(t_0, x_0 \in \mathbb{R}\) and show that there cannot be two distinct solutions to \(L(y)=0\) and \(y(t_0)=x_0\text{.}\)
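One possible reduction to Theorem 6.6: if \(y_1\) and \(y_2\) both satisfy \(L(y)=0\) and \(y(t_0)=x_0\text{,}\) set \(w = y_1 - y_2\) and \(\tilde w(t) = w(t + t_0)\text{;}\) then \(L(\tilde w)=0\) and \(\tilde w(0)=0\text{,}\) so the previous theorem, or its argument adapted to the relevant interval, should force \(w=0\text{.}\)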
Theorem 6.8
Show there is a unique solution to the initial value problem \(y'' + y = 0, y(0)=0, y'(0)=1\) as follows:
Convert the second order equation to a first order system, \(\displaystyle \begin{pmatrix} u \\ v \end{pmatrix}' = A \begin{pmatrix} u \\ v \end{pmatrix}, \quad \begin{pmatrix} u \\ v \end{pmatrix}(0) = \begin{pmatrix} 0 \\ 1 \end{pmatrix}\text{,}\) where \(A\) is a \(2 \times 2\) matrix.
Apply Picard's iteration to obtain sequences of functions, \(u_0, u_1, \dots\) and \(v_0, v_1, \dots\) (a computational sketch of this step follows the theorem).
Show that there are functions \(u\) and \(v\) so that \((u_n)_{n=1}^\infty \rightarrow u\) and \((v_n)_{n=1}^\infty \rightarrow v\text{,}\) and that \(\begin{pmatrix} u \\ v \end{pmatrix}\) is a solution to the first order system and therefore a solution to the initial value problem.
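Here is a minimal computational sketch of the second step, assuming sympy and assuming the reduction \(u = y\text{,}\) \(v = y'\) (so that \(A = \begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix}\)); treat it only as a way to guess what the \(u_n\) and \(v_n\) look like, not as a proof of anything.

```python
# Picard iteration for (u, v)' = A (u, v), (u, v)(0) = (0, 1),
# assuming the reduction u = y, v = y', so that A = [[0, 1], [-1, 0]].
import sympy as sp

t, s = sp.symbols("t s")
u, v = sp.Integer(0), sp.Integer(1)                       # (u_0, v_0) = (0, 1)
for n in range(1, 5):
    u, v = (0 + sp.integrate(v.subs(t, s), (s, 0, t)),    # u_n = 0 + integral_0^t v_{n-1}
            1 + sp.integrate(-u.subs(t, s), (s, 0, t)))   # v_n = 1 + integral_0^t (-u_{n-1})
    print(f"u_{n}(t) =", sp.expand(u), f"    v_{n}(t) =", sp.expand(v))
# The u_n build up partial sums of t - t**3/6 + ... and the v_n partial
# sums of 1 - t**2/2 + ..., which should look familiar from calculus.
```

Note that each pass evaluates both right-hand sides with the previous iterates before reassigning, which is exactly the Picard recursion for the system.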