There are many ways to approximate a given continuous function $f$ (I will consider the interval $[0,1]$ for convenience). For example, one can use piecewise linear interpolation through the points $(k/n, f(k/n))$, where $k=0,1,\dots,n$. The resulting piecewise linear function $L_n f$ has some nice properties: for example, it is increasing if $f$ is increasing. But it is not smooth.
A convenient way to represent piecewise linear interpolation is the sum $L_n f(x)=\sum_{k=0}^n f(k/n)\,\varphi_k(x)$, where the functions $\varphi_k$ are the triangles shown below: $\varphi_k(x)=\max(0,\,1-|nx-k|)$.
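To make this concrete, here is a minimal sketch in Python (my own code, not part of the original construction); the function names and the test function are illustrative choices.

```python
# A minimal sketch (my own code): the triangular "hat" basis phi_k on the
# uniform grid of [0, 1] and the piecewise linear interpolant
# L_n f = sum_k f(k/n) * phi_k.
import numpy as np

def hat(x, k, n):
    """phi_k(x) = max(0, 1 - |n*x - k|): equals 1 at k/n, 0 at the other grid points."""
    return np.maximum(0.0, 1.0 - np.abs(n * x - k))

def piecewise_linear(f, n, x):
    """L_n f: piecewise linear interpolation of f through the points (k/n, f(k/n))."""
    return sum(f(k / n) * hat(x, k, n) for k in range(n + 1))

x = np.linspace(0.0, 1.0, 1001)
print("max |f - L_10 f| for f = sqrt:",
      np.max(np.abs(piecewise_linear(np.sqrt, 10, x) - np.sqrt(x))))
```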

The functions $\varphi_k$ form a partition of unity, meaning that $\sum_{k=0}^n \varphi_k\equiv 1$ and all $\varphi_k$ are nonnegative. This property leads to the estimate
$$|f(x)-L_n f(x)| = \Big|\sum_{k=0}^n \big(f(x)-f(k/n)\big)\,\varphi_k(x)\Big| \le \sum_{k=0}^n |f(x)-f(k/n)|\,\varphi_k(x).$$
The latter sum is small because when $x$ is close to $k/n$, the first factor $|f(x)-f(k/n)|$ is small by virtue of continuity, while the second factor $\varphi_k(x)$ is bounded by $1$. When $x$ is far from $k/n$, the second factor $\varphi_k(x)$ is zero, so the first one is irrelevant. The upshot is that $|f-L_n f|$ is uniformly small.
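A standalone numerical check of the partition-of-unity property and of this estimate might look as follows (my own sketch; the test function $\sin 5x$ is an arbitrary choice):

```python
# A standalone numerical check (my own) of the partition-of-unity property
# and of the error estimate above.
import numpy as np

n = 20
x = np.linspace(0.0, 1.0, 2001)
nodes = np.arange(n + 1) / n
phis = np.array([np.maximum(0.0, 1.0 - np.abs(n * x - k)) for k in range(n + 1)])

f = lambda t: np.sin(5 * t)
Lf = f(nodes) @ phis                                           # L_n f = sum_k f(k/n) phi_k
bound = (np.abs(f(x) - f(nodes)[:, None]) * phis).sum(axis=0)  # sum_k |f(x)-f(k/n)| phi_k(x)

print("max |sum_k phi_k - 1|:", np.max(np.abs(phis.sum(axis=0) - 1.0)))
print("max |f - L_n f|      :", np.max(np.abs(f(x) - Lf)))
print("max of the bound     :", np.max(bound))
```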
But if we want a smooth approximation $S_n f := \sum_{k=0}^n f(k/n)\,\varphi_k$, we need a smooth partition of unity $\{\varphi_k\}$. However, not just any set of smooth nonnegative functions that add up to $1$ is equally good. One desirable property is preserving monotonicity: if $f$ is increasing, then $S_n f$ should be increasing, just as it is for piecewise linear interpolation. What does this condition require of our partition of unity?
An increasing function can be expressed as a limit of sums of the form $\sum_j c_j\,[x\ge t_j]$, where $c_j>0$ and $[\cdot]$ is the Iverson bracket: 1 if true, 0 if false. By linearity, it suffices to have increasing $S_n f$ for the case $f(x)=[x\ge t]$. In this case $S_n f$ is simply $\sum_{k\ge m}\varphi_k$ for some $m$, $0\le m\le n$. So we want all the tail sums $\sum_{k\ge m}\varphi_k$ to be increasing functions. This is the case for the triangular partition of unity, where each such sum looks like this:

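One can also check this numerically (a small sketch of my own; the values of $n$ and $m$ are arbitrary):

```python
# A small check that the tail sums sum_{k >= m} phi_k of the triangular
# basis are nondecreasing.
import numpy as np

n, m = 10, 4
x = np.linspace(0.0, 1.0, 2001)
tail = sum(np.maximum(0.0, 1.0 - np.abs(n * x - k)) for k in range(m, n + 1))
print("tail sum is nondecreasing:", bool(np.all(np.diff(tail) >= -1e-12)))
```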
One smooth choice is the Bernstein basis polynomials: $\varphi_k(x)=\binom{n}{k}x^k(1-x)^{n-k}$. These are nonnegative on $[0,1]$, and the binomial formula shows $\sum_{k=0}^n \varphi_k(x)=(x+(1-x))^n=1$. Are the sums $\sum_{k\ge m}\varphi_k$ increasing with $x$? Let’s find out. By the product rule,
$$\frac{d}{dx}\sum_{k\ge m}\varphi_k(x)=\sum_{k\ge m}\binom{n}{k}\,k\,x^{k-1}(1-x)^{n-k}-\sum_{k\ge m}\binom{n}{k}\,(n-k)\,x^{k}(1-x)^{n-k-1}.$$
In the second sum the term with $k=n$ vanishes, and the terms with $k<n$ can be rewritten as $\frac{n!}{k!\,(n-k-1)!}\,x^{k}(1-x)^{n-k-1}$, which is $\binom{n}{k+1}\,(k+1)\,x^{k}(1-x)^{n-(k+1)}$, which is $\binom{n}{k+1}\,(k+1)\,x^{(k+1)-1}(1-x)^{n-(k+1)}$. After the index shift $k+1\to k$ this becomes identical to the terms of the first sum and cancels them out (except for the first one, $k=m$). Thus,
$$\frac{d}{dx}\sum_{k\ge m}\varphi_k(x)=\binom{n}{m}\,m\,x^{m-1}(1-x)^{n-m}\ge 0.$$
To summarize: the Bernstein polynomials $B_n f=\sum_{k=0}^n f(k/n)\,\varphi_k$ are monotone whenever $f$ is. On the other hand, the proof that $B_n f\to f$ uniformly is somewhat complicated by the fact that the polynomial basis functions $\varphi_k$ are not localized the way that the triangle basis functions are: the factors $\varphi_k(x)$ do not vanish when $x$ is far from $k/n$. I refer to Wikipedia for a proof of convergence (which, by the way, is quite slow).
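As an illustration (my own sketch, not from the original argument), here is $B_n f$ for $f(x)=\sqrt{x}$: it stays monotone, but the maximal error shrinks slowly as $n$ grows.

```python
# The Bernstein approximation B_n f(x) = sum_k f(k/n) * C(n,k) * x^k * (1-x)^(n-k):
# monotone for increasing f, but slowly convergent.
import numpy as np
from math import comb

def bernstein(f, n, x):
    return sum(f(k / n) * comb(n, k) * x**k * (1 - x)**(n - k) for k in range(n + 1))

x = np.linspace(0.0, 1.0, 1001)
f = np.sqrt                              # increasing, with infinite slope at 0
for n in (8, 32, 128):
    Bf = bernstein(f, n, x)
    print(f"n = {n:3d}: max error = {np.max(np.abs(Bf - f(x))):.4f}, "
          f"nondecreasing = {bool(np.all(np.diff(Bf) >= -1e-12))}")
```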

Is there some middle ground between non-smooth triangles and non-localized polynomials? Yes, of course: piecewise polynomials, i.e., splines. More specifically, B-splines, which can be defined as follows: B-splines of degree $1$ are the triangle basis functions shown above; a B-spline of degree $d+1$ is the moving average of a B-spline of degree $d$ with a window of length $h=1/n$. The moving average of a function $g$ can be written as $\frac{1}{h}\int_{x-h/2}^{x+h/2} g(t)\,dt$. We get a partition of unity because the sum of moving averages is the moving average of the sum, and averaging a constant function does not change it.
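Here is a rough numerical sketch of this construction (my own code; the grid spacing and the value of $n$ are arbitrary choices): the average of a single hat function over a window of length $h=1/n$, computed by discrete convolution.

```python
# Moving average of a hat function over a window of length h = 1/n,
# approximated by discrete convolution on a fine grid.
import numpy as np

n = 4
h, dx = 1.0 / n, 1e-4
x = np.arange(-1.0, 2.0, dx)                     # a grid extending beyond [0, 1]
hat0 = np.maximum(0.0, 1.0 - np.abs(n * x))      # triangle centered at 0
window = np.ones(int(round(h / dx)))
window /= window.size                            # uniform averaging window of length h
psi0 = np.convolve(hat0, window, mode="same")    # moving average: a quadratic B-spline
print("peak of the averaged hat (3/4 for the quadratic B-spline):", psi0.max())
```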
The splines of even degrees are awkward to work with… they are obtained from the triangles by applying the moving average an odd number of times, which makes their knots fall at the midpoints of the uniform grid instead of the grid points themselves. But I will use degree $2$ anyway, because this degree is enough for $C^1$-smooth approximation.
Recall that a triangular basis function $\varphi_k$ has slope $\pm n$ and is supported on the interval $[x_{k-1},x_{k+1}]$, where $x_k=k/n$. Accordingly, its moving average $\psi_k$ will be supported on $[x_k-\tfrac{3}{2n},\,x_k+\tfrac{3}{2n}]$. Since $\psi_k'(x)=n\big(\varphi_k(x+\tfrac{1}{2n})-\varphi_k(x-\tfrac{1}{2n})\big)$, the second derivative $\psi_k''$ is $n^2$ when $x_k-\tfrac{3}{2n}<x<x_k-\tfrac{1}{2n}$, is $-2n^2$ when $|x-x_k|<\tfrac{1}{2n}$, and is $n^2$ again when $x_k+\tfrac{1}{2n}<x<x_k+\tfrac{3}{2n}$. This is enough to figure out the formula for $\psi_k$:
$$\psi_k(x)=\begin{cases}\dfrac{n^2}{2}\Big(x-x_k+\dfrac{3}{2n}\Big)^2, & x_k-\dfrac{3}{2n}\le x\le x_k-\dfrac{1}{2n},\\[4pt] \dfrac{3}{4}-n^2(x-x_k)^2, & |x-x_k|\le \dfrac{1}{2n},\\[4pt] \dfrac{n^2}{2}\Big(x-x_k-\dfrac{3}{2n}\Big)^2, & x_k+\dfrac{1}{2n}\le x\le x_k+\dfrac{3}{2n},\\[4pt] 0, & \text{otherwise.}\end{cases}$$
These look like:

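A short sketch of this basis in code (my own; the choice $n=8$ is illustrative) implements the piecewise formula and confirms that, away from the endpoints, the functions still sum to 1:

```python
# The quadratic B-spline basis psi_k from the piecewise formula above,
# with a partition-of-unity check away from the endpoints.
import numpy as np

def psi(x, k, n):
    """Quadratic B-spline centered at x_k = k/n, supported on [x_k - 3/(2n), x_k + 3/(2n)]."""
    t, h = x - k / n, 1.0 / n
    out = np.zeros_like(x)
    left  = (t >= -1.5 * h) & (t < -0.5 * h)
    mid   = np.abs(t) <= 0.5 * h
    right = (t > 0.5 * h) & (t <= 1.5 * h)
    out[left]  = 0.5 * n**2 * (t[left] + 1.5 * h) ** 2
    out[mid]   = 0.75 - n**2 * t[mid] ** 2
    out[right] = 0.5 * n**2 * (t[right] - 1.5 * h) ** 2
    return out

n = 8
x = np.linspace(0.0, 1.0, 2001)
total = sum(psi(x, k, n) for k in range(n + 1))
interior = (x > 1.5 / n) & (x < 1.0 - 1.5 / n)
print("max |sum_k psi_k - 1| away from the endpoints:", np.max(np.abs(total[interior] - 1.0)))
```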
Nice! But wait a moment, the sum near the endpoints is not constant: it is less than 1 because we do not get the contributions of the two splines just to the left and right of the interval. To correct for this boundary effect, replace $\psi_0$ with $\psi_0+\psi_{-1}$ and $\psi_n$ with $\psi_n+\psi_{n+1}$, using “ghost” elements of the basis that lie outside of the actual grid. Now the quadratic B-spline basis is correct:
Does this partition of unity preserve monotonicity? Yes, it does:
$$\frac{d}{dx}\sum_{k\ge m}\psi_k(x)=n\Big(\sum_{k\ge m}\varphi_k\big(x+\tfrac{1}{2n}\big)-\sum_{k\ge m}\varphi_k\big(x-\tfrac{1}{2n}\big)\Big),$$
which is nonnegative because the sum $\sum_{k\ge m}\varphi_k$ is an increasing piecewise linear function, as noted previously. The same logic works for B-splines of higher degree.
In conclusion, here is a quadratic B-spline approximation (orange) to a tricky increasing function (blue).

One may wonder why the orange curve deviates from the line at the end – did we miss some boundary effect there? Yes, in a way… the spline actually approximates the continuous extension of our original function by constant values on the left and right. Imagine the blue graph continuing to the right as a horizontal line: this creates a corner at $x=1$, and the spline is smoothing that corner. To avoid this effect, one may want to extend $f$ in a better way and then work with the extended function, not folding the ghosts $\psi_{-1},\psi_{n+1}$ into $\psi_0,\psi_n$.
But even so, the B-spline achieves a better approximation than the Bernstein polynomial with the same number of basis functions (eight):

The reason is the non-local nature of the polynomial basis $\binom{n}{k}x^k(1-x)^{n-k}$, which was noted above. Bernstein polynomials do match the function perfectly at the endpoints, but this is small consolation.
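For a rough numerical version of this comparison (my own sketch; $\arctan(20(x-\tfrac12))$ stands in for the “tricky” increasing function, which is not reproduced here), one can fold in the ghost splines as described above and compare the maximal errors of the quadratic B-spline and the Bernstein polynomial, each built from eight basis functions:

```python
# Quadratic B-spline (with ghost splines folded into the endpoints) vs.
# Bernstein polynomial, each with eight basis functions.
import numpy as np
from math import comb

def psi(x, k, n):
    """Quadratic B-spline centered at k/n (the piecewise formula derived earlier)."""
    t, h = x - k / n, 1.0 / n
    out = np.zeros_like(x)
    left  = (t >= -1.5 * h) & (t < -0.5 * h)
    mid   = np.abs(t) <= 0.5 * h
    right = (t > 0.5 * h) & (t <= 1.5 * h)
    out[left]  = 0.5 * n**2 * (t[left] + 1.5 * h) ** 2
    out[mid]   = 0.75 - n**2 * t[mid] ** 2
    out[right] = 0.5 * n**2 * (t[right] - 1.5 * h) ** 2
    return out

n = 7                                            # nodes k/n, k = 0..7: eight basis functions
x = np.linspace(0.0, 1.0, 2001)
f = lambda t: np.arctan(20 * (t - 0.5))          # an increasing function with a steep middle

basis = [psi(x, k, n) for k in range(n + 1)]
basis[0]  += psi(x, -1, n)                       # fold the ghost splines into the
basis[-1] += psi(x, n + 1, n)                    # first and last basis elements
spline = sum(f(k / n) * basis[k] for k in range(n + 1))
bern = sum(f(k / n) * comb(n, k) * x**k * (1 - x)**(n - k) for k in range(n + 1))

print("spline is nondecreasing:", bool(np.all(np.diff(spline) >= -1e-12)))
print("max error, B-spline    :", np.max(np.abs(spline - f(x))))
print("max error, Bernstein   :", np.max(np.abs(bern - f(x))))
```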