1.2. Power series and finite differences
A standard way to uncover the accuracy of a finite-difference formula is by expansion in power series.
Definition 1.2.1 (Truncation error for finite differences)
If \(d_h(u)\) is a finite-difference approximation to \(u'(0)\), the truncation error of the approximation is
\[
T(h) = d_h(u) - u'(0).
\]
The power of \(h\) in the leading term of \(T(h)\) is the order of accuracy of the FD formula.
Example 1.2.1
For the forward-difference formula applied to a smooth \(u(x)\),
\[
T(h) = \frac{u(h) - u(0)}{h} - u'(0) = \frac{1}{2}u''(0)\,h + O(h^2).
\]
Thus, it is a first-order method.
There is a lot of boilerplate in these calculations that we can condense using formal power series. Suppose \(hD\) represents the differentiation operator times the step size \(h\). Then define \(Z=\exp(hD)\). In Sage, this is done via the following:
R.<hD> = QQ[[]]
ser = hD + O(hD^8)
Z = ser.exp()
Z
We can interpret Taylor’s theorem as stating that \(Z\) is an operator that shifts a smooth function \(u(x)\) to the function \(u(x+h)\). Thus, for example, the forward difference formula is represented by the series
fd1 = Z - 1
fd1
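Sage's power series ring handles the bookkeeping above, but the same formal computation can be cross-checked in plain Python with SymPy (assuming SymPy is available; the symbol `x` stands in for \(hD\)):

```python
import sympy as sp

x = sp.symbols('x')   # x plays the role of hD
N = 8

# Z = exp(hD), truncated to a formal polynomial in hD
Z = sp.exp(x).series(x, 0, N).removeO()

# Forward difference: Z - 1 = hD + hD^2/2 + hD^3/6 + ...
fd1 = sp.expand(Z - 1)
print(fd1)
```

The \(hD^2/2\) term is the leading part of the truncation error, confirming first-order accuracy.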
Hence the truncation error is
\[
\frac{1}{h}\bigl[(Z - 1) - hD\bigr] = \frac{1}{2}hD^2 + O(h^2),
\]
which restates the example above. Similarly, the two-term backward difference is also first-order accurate:
bd1 = 1 - Z^(-1)
bd1
The centered difference, however, is second-order accurate:
cd2 = (Z - Z^(-1))/2
cd2
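A quick SymPy cross-check (again with `x` standing in for \(hD\); SymPy itself is an assumption, not part of the text) confirms the extra order:

```python
import sympy as sp

x = sp.symbols('x')
N = 8

Z = sp.exp(x).series(x, 0, N).removeO()
Zinv = sp.exp(-x).series(x, 0, N).removeO()

# Centered difference: (Z - Z^{-1})/2 = hD + hD^3/6 + ...
cd2 = sp.expand((Z - Zinv) / 2)
err = sp.expand(cd2 - x)   # truncation error series
print(err)                 # leading term hD^3/6
```

The error series has no \(hD^2\) term at all, which is the second-order accuracy claimed above.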
One interpretation is that the antisymmetry around \(h=0\) buys us an extra order.
Derivations
The power series analysis can be used to derive new FD methods on equispaced grids. You decide which nodes to include (i.e., the stencil of the method), give each function value an unknown coefficient, expand everything around the evaluation point, and choose the coefficients to cancel out as many terms as possible.
There is a more direct method, though. Formally, \(Z=e^{hD}\) around \(h=0\), so that \(hD = \log(Z)\) around \(Z=1\). Hence, finite difference formulas are found by truncating expansions of \(\log(z)\) around \(z=1\):
var('z')
taylor(log(z),z,1,4)
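The same expansion is easy to reproduce outside Sage; here is a SymPy sketch (an assumption for illustration) that expands \(\log(1+t)\) and substitutes \(t = z - 1\), mirroring `taylor(log(z),z,1,4)`:

```python
import sympy as sp

z, t = sp.symbols('z t')

# Degree-4 Taylor expansion of log(z) about z = 1
ser = sp.log(1 + t).series(t, 0, 5).removeO()
p = sp.expand(ser.subs(t, z - 1))
print(p)
```

The polynomial vanishes at \(z = 1\) and has unit slope there, as it must, since \(\log(1) = 0\) and \(\log'(1) = 1\).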
If we didn’t already know the two-point forward difference formula, we’d start with the ansatz
\[
a_1 Z + a_0 \approx hD = \log(Z).
\]
Hence,
expand(taylor(log(z),z,1,1))
which gives \(z - 1\): that is, \(a_1 = 1\) and \(a_0 = -1\), recovering \(hD \approx Z - 1\).
If we find a formula using values at \(0\), \(h\), and \(2h\) to get \(u'(0)\), then we have \(a_2 Z^2 + a_1 Z + a_0 \approx \log(Z)\):
expand(taylor(log(z),z,1,2))
This corresponds to a 2nd-order forward difference on three points:
\[
u'(0) \approx \frac{-3u(0) + 4u(h) - u(2h)}{2h}.
\]
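We can confirm the order of this three-point formula with the shift-operator series in SymPy (a cross-check under the assumption SymPy is available; `x` for \(hD\)):

```python
import sympy as sp

x = sp.symbols('x')
Z = sp.exp(x).series(x, 0, 8).removeO()

# (-3 + 4Z - Z^2)/2 should match hD through second order
fd2 = sp.expand((-3 + 4*Z - Z**2) / 2)
err = sp.expand(fd2 - x)
print(err)   # leading term -hD^3/3: second-order accurate
```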
We can derive centered, backward, and other formulas by generalizing the trick just a little. In the centered three-point formula, we have
\[
a_1 Z + a_0 + a_{-1} Z^{-1} \approx \log(Z),
\]
and multiplying through by \(Z\) turns the left side into a polynomial in \(Z\), so we expand \(z\log(z)\) instead:
expand(taylor(z*log(z),z,1,2))
Similarly, multiplying through by \(Z^2\), we can derive a five-point centered formula via
expand(taylor(z^2*log(z),z,1,4))
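The coefficients that come out are those of the classic five-point formula \(hD \approx \tfrac{1}{12}\left(-Z^2 + 8Z - 8Z^{-1} + Z^{-2}\right)\). A SymPy check of its order (SymPy assumed; `x` for \(hD\)):

```python
import sympy as sp

x = sp.symbols('x')
N = 10
Z = sp.exp(x).series(x, 0, N).removeO()
Zinv = sp.exp(-x).series(x, 0, N).removeO()

# Five-point centered formula for the first derivative
cd4 = sp.expand((-Z**2 + 8*Z - 8*Zinv + Zinv**2) / 12)
err = sp.expand(cd4 - x)
print(err)   # leading term -hD^5/30: fourth-order accurate
```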
Higher derivatives
We can play the same game for formulas for the 2nd and higher derivatives, starting from \((hD)^2 = \log(Z)^2\). Here is the most popular formula for a second derivative, centered on three nodes:
expand(taylor(z*log(z)^2,z,1,3))
We can verify that the formula is second-order accurate:
(Z^2-2*Z+1)/Z
I.e.,
\[
\frac{u(h) - 2u(0) + u(-h)}{h^2} = u''(0) + \frac{h^2}{12}u^{(4)}(0) + O(h^4).
\]
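The same verification goes through in SymPy (assumed available; `x` for \(hD\)):

```python
import sympy as sp

x = sp.symbols('x')
Z = sp.exp(x).series(x, 0, 8).removeO()
Zinv = sp.exp(-x).series(x, 0, 8).removeO()

# Centered second difference: Z - 2 + Z^{-1} = hD^2 + hD^4/12 + ...
d2 = sp.expand(Z - 2 + Zinv)
err = sp.expand(d2 - x**2)
print(err)   # leading term hD^4/12
```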
This can be carried out to 4th order using 5 points:
expand(taylor(z^2*log(z)^2,z,1,5))
We can verify its order in the power series ring:
(-1/12*Z^4 + 4/3*Z^3 - 5/2*Z^2 + 4/3*Z - 1/12)/Z^2
We can also do asymmetric (forward and backward) formulas:
expand(taylor(log(z)^2,z,1,3))
I.e.,
\[
u''(0) \approx \frac{2u(0) - 5u(h) + 4u(2h) - u(3h)}{h^2},
\]
which is second-order accurate.
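A final SymPy cross-check that this one-sided second-derivative formula is indeed second-order (SymPy assumed; `x` for \(hD\)):

```python
import sympy as sp

x = sp.symbols('x')
N = 8
Z = sp.exp(x).series(x, 0, N).removeO()

# One-sided second difference: 2 - 5Z + 4Z^2 - Z^3 = hD^2 - 11*hD^4/12 + ...
d2f = sp.expand(2 - 5*Z + 4*Z**2 - Z**3)
err = sp.expand(d2f - x**2)
print(err)   # leading term -11*hD^4/12: second-order accurate
```

Notice that the \(hD^3\) term cancels even though the stencil is not symmetric; the extra fourth node buys the second order.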