In linear algebra, the condition number plays a well-known role in quantifying the reliability of solutions. In the numerical solution of differential equations, the phenomenon of sensitivity to perturbations is often illustrated, but it is less common to see a condition number. In this post, I compare and contrast two approaches to quantifying stability. Examples include a dart throw (from a book by Fox and Mayers) and a devious hole of miniature golf.

Interpolation is a surprisingly difficult problem. The basic idea is simple: given a set of points in the $xy$-plane, fill the gaps to produce a continuous function. A simple approach is piecewise-linear interpolation:
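A minimal sketch of the idea in NumPy (the sample function and nodes here are illustrative assumptions): `np.interp` evaluates the piecewise-linear interpolant through given data points.

```python
import numpy as np

# A handful of samples of a smooth function (illustrative choice).
x_data = np.linspace(0.0, 2.0, 5)
y_data = np.exp(x_data)

# np.interp connects consecutive data points with line segments
# and evaluates the resulting piecewise-linear function.
x_new = np.array([0.25, 1.0, 1.75])
y_new = np.interp(x_new, x_data, y_data)

print(y_new)
```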

However, the resulting function is not smooth. Lack of smoothness can cause various problems—theorems may not apply, methods may fail. Plus, if the data points are assumed to come from some smooth function lurking in the background, then a nonsmooth interpolant will likely not produce a very close fit.

A natural candidate for a smooth interpolant is a polynomial. Polynomials are ubiquitous in high school algebra, most often written in the form $p(x) = c_n x^n + \cdots + c_1 x + c_0$. Using calculus, a degree-$n$ polynomial can alternatively be characterized as a function whose $(n+1)$th derivative equals zero. Polynomials are smooth and relatively easy to work with.

The problem of polynomial interpolation is to find a single high-degree polynomial to fit the data points rather than a sequence of short linear functions. A robust solution to this problem is harder than we might expect.

Let $f$ be a smooth function, and consider replacing $f$ by a close-fitting polynomial $p$. This might be desirable if $f$ is defined by a complicated formula or if it is defined implicitly as the solution of an equation or an indefinite integral, for example. A natural idea is to sample $f$ at regularly spaced points—

—and then to construct a polynomial whose graph passes through those points. Unfortunately, this often fails! The classic example is Runge’s function $f(x) = 1/(1 + 25x^2)$ on $[-1, 1]$. The plotted function above is in fact Runge’s function. The polynomial that interpolates the data points above is the following:

This is bad! The polynomial introduces extreme oscillations that are not present in the graph of . Even worse, the oscillations increase in frequency and amplitude as the number of data points is increased. This phenomenon is known as the Runge phenomenon.
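A short NumPy experiment (the degree and grid size are illustrative assumptions) reproduces the blow-up: fit a polynomial through equally spaced samples of Runge’s function and measure how far it strays from the function itself.

```python
import numpy as np

def runge(x):
    return 1.0 / (1.0 + 25.0 * x**2)

# Interpolate at n+1 equally spaced points on [-1, 1].
n = 16
x_equi = np.linspace(-1.0, 1.0, n + 1)
p = np.polynomial.polynomial.Polynomial.fit(x_equi, runge(x_equi), deg=n)

# The interpolant matches the data exactly at the nodes, yet
# oscillates wildly between them near the ends of the interval.
x_fine = np.linspace(-1.0, 1.0, 1001)
max_err = np.max(np.abs(p(x_fine) - runge(x_fine)))
print(f"max interpolation error, {n + 1} equispaced points: {max_err:.2f}")
```

Raising `n` makes the error worse, not better.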

There is a surprisingly simple resolution. Instead of sampling at uniformly spaced points, sample on a Chebyshev grid, $x_j = \cos(j\pi/m)$ for $j = 0, 1, \ldots, m$:

Notice how the Chebyshev points are more dense near the endpoints of the interval, where the oscillations were previously most severe. As long as $f$ is sufficiently smooth, the Runge phenomenon will not occur.
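Repeating the experiment on a Chebyshev grid (again a sketch, with an illustrative degree of 16 and NumPy’s Chebyshev class for a well-conditioned fit) shows the error shrinking instead of exploding:

```python
import numpy as np

def runge(x):
    return 1.0 / (1.0 + 25.0 * x**2)

# Chebyshev points on [-1, 1] cluster near the endpoints.
n = 16
x_cheb = np.cos(np.arange(n + 1) * np.pi / n)

# Interpolate Runge's function at the Chebyshev points.
p = np.polynomial.chebyshev.Chebyshev.fit(x_cheb, runge(x_cheb), deg=n)

# The maximum error over the interval is now small, and it
# decreases as n grows.
x_fine = np.linspace(-1.0, 1.0, 1001)
max_err = np.max(np.abs(p(x_fine) - runge(x_fine)))
print(f"max interpolation error, {n + 1} Chebyshev points: {max_err:.4f}")
```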

…

The cover of my book Numerical Analysis: Theory and Experiments features the following shapes:

These show polynomial interpolants in the complex plane. The “wavy ribbon” comes from uniformly spaced sample points, while the “saddle” comes from Chebyshev points. In the first case, we see a more complete picture of the Runge phenomenon. In the second case, we see that the Chebyshev interpolant captures behavior not just along the real line but also into the complex plane.

The Runge phenomenon occurs when there are singularities in the complex plane sufficiently close to the interpolation interval on the real line. In the case of Runge’s function $f(x) = 1/(1 + 25x^2)$, there are singularities at $x = \pm i/5$. These are suggested by the two “pommels” at the top of the saddle. If the graph were extended farther into the complex plane, these peaks would explode to infinity. Chebyshev interpolation performs well even in the presence of such infinite singularities.

The Runge phenomenon is discussed in chapter 9 of my book, along with another, arguably greater, danger called “ill conditioning.”