4.8. Next steps

Fixed-point iteration has been used to prove results about convergence; see Quarteroni et al. [QSS07] for more details.

Solving nonlinear systems is a complex task. Classic texts on the subject include Ortega and Rheinboldt [OR14] and Kelley [Kel95], among many others.

Turning to nonlinear least squares, the widely used Levenberg–Marquardt algorithm has a connection to the authors’ home institution. Donald Marquardt completed an MS in mathematics and statistics at the University of Delaware in 1956, with a thesis entitled “Application of Digital Computers to Statistics.” He developed the method while working at DuPont Corporation’s laboratories, where he was employed from 1953 to 1991: he needed a better method for fitting experimental data generated at DuPont, and he created one. The method was published in 1963 in the Journal of the Society for Industrial and Applied Mathematics [Mar63]. In a note at the end of the paper, he thanks a referee for pointing out that Levenberg had published related ideas elsewhere [Lev44], and he cites that work. He also made available a FORTRAN program implementing the method. There’s a lesson there for budding mathematical software designers: if you want people to use your algorithm or method, give away software to do it! An interesting discussion of the evolution of computing during his career at DuPont can be found in an interview [Hah95].
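To make the connection to computation concrete, here is a minimal sketch of the Levenberg–Marquardt idea in Python with NumPy. The user-supplied functions `residual` and `jacobian`, the factor-of-10 damping updates, and the simple accept/reject rule are illustrative assumptions for this sketch, not Marquardt’s original code or a definitive implementation.

```python
import numpy as np

def levenberg_marquardt(residual, jacobian, x0, lam=1e-3, maxiter=50, tol=1e-10):
    """Sketch of Levenberg-Marquardt for minimizing ||residual(x)||^2.

    residual : function returning the residual vector r(x)
    jacobian : function returning the Jacobian matrix J(x) of r
    The damping updates (factors of 10) are common choices, not canonical.
    """
    x = np.asarray(x0, dtype=float)
    r = residual(x)
    for _ in range(maxiter):
        J = jacobian(x)
        A = J.T @ J
        g = J.T @ r
        # Damped Gauss-Newton step: (J'J + lam*diag(J'J)) s = -J'r
        s = np.linalg.solve(A + lam * np.diag(np.diag(A)), -g)
        x_new = x + s
        r_new = residual(x_new)
        if np.linalg.norm(r_new) < np.linalg.norm(r):
            x, r = x_new, r_new   # accept the step and trust the model more
            lam /= 10
        else:
            lam *= 10             # reject the step and increase the damping
        if np.linalg.norm(g, np.inf) < tol:
            break
    return x
```

The damping parameter interpolates between a Gauss–Newton step (small damping) and a short step in the steepest-descent direction (large damping), which is the essential idea of the method.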

Optimization is closely linked to solving nonlinear equations as well. More information can be found, among many other places, in Strang [Str07], Quarteroni et al. [QSS07], and Conn et al. [CSV09].

Finally, we close with a remark about polynomials. Polynomials are common, and finding their roots is a frequently occurring task. As we saw in Chapter 1, however, the condition number of polynomial rootfinding can be large, and the possibility of complex roots poses another challenge. It turns out that while general-purpose rootfinding can work for polynomials, it is faster and more robust to use a method tailored to the task. One of the best-known approaches is to convert the problem into finding the eigenvalues of a related matrix. A recent paper on the subject won an outstanding paper award from the Society for Industrial and Applied Mathematics [AMVW15]; it used a clever combination of the companion matrix and a specialized version of the QR eigenvalue iteration to develop a fast and backward stable method. That paper also contains a very good comparison of modern methods for polynomial rootfinding.
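To illustrate the companion-matrix idea, here is a small Python sketch using NumPy. It takes the straightforward route of building the full companion matrix and calling a general eigenvalue solver, which is essentially what `numpy.roots` does; the method of [AMVW15] instead exploits the special structure of this matrix to gain speed and backward stability.

```python
import numpy as np

def poly_roots(c):
    """Roots of p(x) = c[0]*x**n + c[1]*x**(n-1) + ... + c[n], with c[0] != 0.

    Builds the companion matrix of the monic form of p and returns its
    eigenvalues, which are exactly the roots of p.
    """
    c = np.asarray(c, dtype=float)
    c = c / c[0]                   # normalize so the polynomial is monic
    n = len(c) - 1
    C = np.zeros((n, n))
    C[1:, :-1] = np.eye(n - 1)     # ones on the subdiagonal
    C[:, -1] = -c[:0:-1]           # last column holds the negated coefficients
    return np.linalg.eigvals(C)

# Example: p(x) = x^2 - 3x + 2 has roots 1 and 2
print(poly_roots([1, -3, 2]))
```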