Transcendental-free Riemann-Lebesgue lemma

Calculus books tend to introduce transcendental functions (trigonometric, exponential, logarithm) early. Analysis textbooks such as Principles of Mathematical Analysis by Rudin tend to introduce them later, because of how long it takes to develop enough of the theory of power series.

The Riemann-Lebesgue lemma involves either trigonometric or exponential functions. But the following version works with the “late transcendentals” approach.

Transcendental-free Riemann-Lebesgue Lemma

TFRLL. Suppose that {f\colon [a, b]\to \mathbb R} and {g\colon \mathbb R\to \mathbb R} are continuously differentiable functions, and {g} is bounded. Then {\int_a^b f(x)g'(nx)\,dx \to 0} as {n\to\infty}.

The familiar form of the lemma is recovered by letting {g(x) = \sin x} or {g(x) = \cos x}.

Proof. By the chain rule, {g'(nx)} is the derivative of {g(nx)/n} with respect to {x}. Integrate by parts:

{\displaystyle  \int_a^b f(x)g'(nx)\,dx = \frac{f(b)g(nb)}{n} - \frac{f(a)g(na)}{n} - \int_a^b f'(x)\frac{g(nx)}{n}\,dx }

By assumption, there exists a constant {M} such that {|g|\le M} everywhere. Hence {\displaystyle \left| \frac{f(b)g(nb)}{n}\right| \le \frac{|f(b)| M}{n} }, {\displaystyle \left|\frac{f(a)g(na)}{n}\right| \le \frac{|f(a)| M}{n}}, and {\displaystyle \left|\int_a^b f'(x)\frac{g(nx)}{n}\,dx\right| \le \frac{M}{n} \int_a^b |f'(x)|\,dx}. By the triangle inequality,

{\displaystyle \left|\int_a^b f(x)g'(nx)\,dx \right| \le \frac{M}{n}\left(|f(b)|+|f(a)| + \int_a^b |f'(x)|\,dx \right) \to 0 }

completing the proof.
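The decay predicted by the proof is easy to observe numerically. Here is a minimal sketch (NumPy assumed; the choices {f(x) = x^2} and {g = \sin} on {[0,1]} are just for illustration):

```python
import numpy as np

# Left Riemann sum of ∫_0^1 f(x) g'(nx) dx with f(x) = x^2 and g = sin,
# so the integrand is x^2 cos(nx). The interval [0, 1] has length 1,
# so the mean of the samples approximates the integral.
def I(n, num=200_000):
    x = np.linspace(0.0, 1.0, num, endpoint=False)
    return np.mean(x**2 * np.cos(n * x))

for n in (10, 100, 1000):
    print(n, I(n))  # decays roughly like 1/n, matching the M/n bound
```

Here {M = 1}, {|f(1)| = 1}, {|f(0)| = 0}, and {\int_0^1 |f'| = 1}, so the proof's bound is {2/n}, and the printed values stay well inside it.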

As a non-standard example, TFRLL applies to, say, {g(x) = \sin (x^2) } for which {g'(x) = 2x\cos (x^2)}. The conclusion is that {\displaystyle \int_a^b f(x) nx \cos (n^2 x^2) \,dx \to 0 }, that is, {\displaystyle  \int_a^b xf(x) \cos (n^2 x^2) \,dx = o(1/n)} which seems somewhat interesting. When {0\notin [a, b]}, the factor of {x} can be removed by applying the result to {f(x)/x}, leading to {\displaystyle \int_a^b f(x) \cos (n^2 x^2) \,dx = o(1/n)}.
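This, too, can be sanity-checked numerically. A sketch (NumPy assumed) with the simplest choice {f \equiv 1}, where the integral is exactly {\sin(n^2)/(2n^2)}:

```python
import numpy as np

# ∫_0^1 x f(x) cos(n^2 x^2) dx with f ≡ 1. The exact value is
# sin(n^2) / (2 n^2), so n times the integral tends to 0,
# consistent with the o(1/n) claim.
def J(n, num=2_000_000):
    x = np.linspace(0.0, 1.0, num, endpoint=False)
    return np.mean(x * np.cos((n * x) ** 2))  # left Riemann sum on [0, 1]

for n in (10, 30, 50):
    print(n, n * J(n))  # tends to 0
```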

What if we tried less smoothness?

Later in Rudin’s book we encounter the Weierstrass theorem: every continuous function on {[a, b]} is a uniform limit of polynomials. Normally, this would be used to make the Riemann-Lebesgue lemma work for any continuous function {f}. But the general form given above, with an unspecified {g}, presents a difficulty.

Indeed, suppose {f} is continuous on {[a, b]}. Given {\epsilon > 0 }, choose a polynomial {p} such that {|p-f|\le \epsilon} on {[a, b]}. Since {p} has a continuous derivative, TFRLL gives {\int_a^b p(x)g'(nx)\,dx \to 0}. It remains to show that {\int_a^b p(x)g'(nx)\,dx} is close to {\int_a^b f(x)g'(nx)\,dx}. By the triangle inequality,

{\displaystyle \left| \int_a^b (p(x) - f(x))g'(nx)\,dx \right| \le \epsilon \int_a^b |g'(nx)|\,dx }

which is bounded by … um. Unlike for {\sin } and {\cos}, we do not have a uniform bound for {|g'|} or for its integral. Indeed, with {g(x) = \sin (x^2)} the integrals {\displaystyle  \int_0^1 |g'(nx)| \,dx = \int_0^1 2nx |\cos (n^2x^2)| \,dx  } grow linearly with {n}. And this behavior would be even worse with {g(x) = \sin (e^x)}, for example.
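The linear growth is easy to confirm: substituting {u = n^2x^2} gives {\int_0^1 2nx|\cos(n^2x^2)|\,dx = \frac1n \int_0^{n^2} |\cos u|\,du \approx \frac{2}{\pi}n}, since {|\cos|} has mean {2/\pi}. A numerical sketch (NumPy assumed):

```python
import numpy as np

# ∫_0^1 2 n x |cos(n^2 x^2)| dx. The substitution u = n^2 x^2 shows this
# equals (1/n) ∫_0^{n^2} |cos u| du ≈ (2/π) n, i.e. linear growth in n.
def K(n, num=2_000_000):
    x = np.linspace(0.0, 1.0, num, endpoint=False)
    return np.mean(2 * n * x * np.abs(np.cos((n * x) ** 2)))

for n in (10, 20, 40):
    print(n, K(n) / n)  # approaches 2/pi ≈ 0.6366
```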

At present I do not see a way to prove TFRLL for continuous {f}, let alone for integrable {f}. But I do not have a counterexample either.

Branching

Multiple kinds of branching here. First, the motorsport content has been moved to formula7.blog. Two blogs? Well, it became clear that my Stack Exchange activity, already on hiatus since 2018, is not going to resume (context: January 14, January 15, January 17). But typing words in boxes is still a hobby of mine.

There may be yet more branching in the knowledge market space, with Codidact and TopAnswers attempting to rise from the ashes of Stack Exchange. (I do not expect either project to have much success.)

Also, examples of branching in complex analysis are often limited to the situations where any two branches differ either by an additive constant like {\log z} or by a multiplicative constant like {z^p}. But different branches can even have different branch sets. Consider the dilogarithm, which has a very nice power series in the unit disk:

{\displaystyle f(z) = \sum_{n=1}^\infty \frac{z^n}{n^2} = z + \frac{z^2}{4} + \frac{z^3}{9} + \frac{z^4}{16} + \cdots}

The series even converges on the unit circle {|z|=1}, providing a continuous extension there. But this circle is also the boundary of the disk of convergence, so some singularity has to appear. And it does, at {z=1}. Going around this singularity and coming back to the unit disk, we suddenly see a function with a branch point at {z=0}, where there was no branching previously.
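The convergence on the closed disk is concrete enough to compute with. A quick sketch in plain Python, using partial sums of the series; the boundary values {f(1) = \pi^2/6} and {f(-1) = -\pi^2/12} are classical (the first is Euler's Basel sum):

```python
import math

# Partial sum of the dilogarithm series  sum_{n>=1} z^n / n^2  (|z| <= 1).
def dilog(z, terms=200_000):
    total, zn = 0.0, 1.0
    for n in range(1, terms + 1):
        zn *= z            # zn = z^n
        total += zn / (n * n)
    return total

# The singularity at z = 1 is mild: the series still converges there.
print(dilog(1.0), math.pi**2 / 6)
print(dilog(-1.0), -math.pi**2 / 12)
```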

What gives? Consider the derivative:

{\displaystyle f'(z) = \sum_{n=1}^\infty \frac{z^{n-1}}{n} = -\frac{\log (1-z)}{z}}

As long as the principal branch of the logarithm is considered, there is no singularity at {z=0}, since {\log(1-0) = 0} cancels the denominator. But once we move around {z=1}, the logarithm acquires a multiple of {2\pi i }, so {f'} gains an additional term {cz^{-1}} with {c} a multiple of {2\pi i}; integrating that term produces logarithmic branching at {z=0}.
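The closed form for {f'} can be checked against the series directly. A small sketch in plain Python (the test point inside the disk is chosen arbitrarily):

```python
import cmath

# Series for f'(z) = sum_{n>=1} z^{n-1} / n, to compare with -log(1-z)/z
# on the principal branch.
def dilog_deriv(z, terms=300):
    total, zpow = 0.0, 1.0  # zpow = z^(n-1)
    for n in range(1, terms + 1):
        total += zpow / n
        zpow *= z
    return total

z = 0.3 + 0.2j  # arbitrary point in the unit disk
print(dilog_deriv(z), -cmath.log(1 - z) / z)
# Passing to another branch replaces log(1-z) by log(1-z) + 2*pi*i*k,
# which changes f' by a multiple of 1/z; its antiderivative involves
# log(z), hence the new branch point at z = 0.
```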

Of course, this does not even begin the story of the dilogarithm, so I refer to Zagier’s expanded survey, which has a few branch points itself.

Thus the dilogarithm is one of the simplest non-elementary functions one can imagine. It is also one of the strangest. It occurs not quite often enough, and in not quite an important enough way, to be included in the Valhalla of the great transcendental functions—the gamma function, Bessel and Legendre functions, hypergeometric series, or Riemann’s zeta function. And yet it occurs too often, and in far too varied contexts, to be dismissed as a mere curiosity. First defined by Euler, it has been studied by some of the great mathematicians of the past—Abel, Lobachevsky, Kummer, and Ramanujan, to name just a few—and there is a whole book devoted to it. Almost all of its appearances in mathematics, and almost all the formulas relating to it, have something of the fantastical in them, as if this function alone among all others possessed a sense of humor.