# The Khintchine inequality

Today’s technology should make it possible to use the native transcription of names like Хинчин without inventing numerous ugly transliterations. The inequality is extremely useful in both analysis and probability: it says that the $L^p$ norm of a linear combination of Rademacher functions (see my post on the Walsh basis) can be computed from its coefficients, up to a multiplicative error that depends only on $p$. Best of all, this works even for the troublesome $p=1$; in fact for all $0<p<\infty$. Formally stated, the inequality is

$\displaystyle A_p\sqrt{\sum c_n^2} \le \left\|\sum c_n r_n\right\|_{L^p} \le B_p\sqrt{\sum c_n^2}$

where the constants $A_p,B_p$ depend only on $p$. The orthogonality of Rademacher functions tells us that $A_2=B_2=1$, but what are the other constants? They were not found until almost 60 years after the inequality was proved. The precise values, established by Haagerup in 1982, behave in a somewhat unexpected way. Actually, only $A_p$ does. The upper bound is reasonably simple:

$\displaystyle B_p=\begin{cases} 1, & 0<p\le 2 \\ \sqrt{2}\left(\Gamma\left(\frac{p+1}{2}\right)\Big/\sqrt{\pi}\right)^{1/p}, & 2<p<\infty \end{cases}$

The lower bound takes an unexpected turn:

$\displaystyle A_p=\begin{cases} 2^{\frac{1}{2}-\frac{1}{p}}, & 0<p\le p_0 \\ \sqrt{2}\left(\Gamma\left(\frac{p+1}{2}\right)\Big/\sqrt{\pi}\right)^{1/p}, & p_0<p<2 \\ 1, & 2\le p<\infty \end{cases}$

The value of $p_0$ is determined by the continuity of $A_p$, and is not far from $2$: precisely, $p_0\approx 1.84742$. Looks like a bug in the design of the Universe.
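Since $A_p$ is continuous, $p_0$ is the point where the branch $2^{\frac{1}{2}-\frac{1}{p}}$ meets the branch involving the Gamma function, and equating the two reduces to solving $\Gamma\left(\frac{p+1}{2}\right)=\frac{\sqrt{\pi}}{2}$. A quick sketch in Python (standard library only) locates this root by bisection:

```python
import math

# p0 solves Gamma((p+1)/2) = sqrt(pi)/2 with p below 2.
# Note p = 2 is also a solution, since Gamma(3/2) = sqrt(pi)/2 exactly;
# that is why the bracket below stops short of 2.
def f(p):
    return math.gamma((p + 1) / 2) - math.sqrt(math.pi) / 2

lo, hi = 1.0, 1.95  # f(1.0) > 0 and f(1.95) < 0, so the root is bracketed
for _ in range(60):
    mid = (lo + hi) / 2
    if f(mid) > 0:
        lo = mid
    else:
        hi = mid

p0 = (lo + hi) / 2
print(round(p0, 5))  # ≈ 1.84742
```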

For a concrete example, I took random coefficients $c_0,\dots,c_4$ and formed the linear combination $\sum c_n r_n$. Then I computed its $L^p$ norm and both bounds in the Khintchine inequality. The norm is in red, the lower bound in green, the upper bound in yellow.

It’s a tight squeeze near $p=2$.
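The experiment is easy to reproduce without any numerical integration: the functions $r_0,\dots,r_4$ take every sign pattern $(\pm 1,\dots,\pm 1)$ on a set of measure $2^{-5}$, so the $L^p$ norm of $\sum c_n r_n$ is an exact average over the $32$ sign patterns. A sketch (the coefficients below are arbitrary stand-ins, not the random ones used for the plot):

```python
import math
from itertools import product

SQRT_PI = math.sqrt(math.pi)
P0 = 1.84742  # root of Gamma((p+1)/2) = sqrt(pi)/2 (Haagerup)

def gaussian_norm(p):
    # L^p norm of a standard Gaussian: sqrt(2) * (Gamma((p+1)/2)/sqrt(pi))^(1/p)
    return math.sqrt(2) * (math.gamma((p + 1) / 2) / SQRT_PI) ** (1 / p)

def A(p):  # Haagerup's sharp lower constant
    if p >= 2:
        return 1.0
    return 2 ** (0.5 - 1 / p) if p <= P0 else gaussian_norm(p)

def B(p):  # Haagerup's sharp upper constant
    return 1.0 if p <= 2 else gaussian_norm(p)

def lp_norm(coeffs, p):
    # Exact L^p norm of sum c_n r_n: average |sum +-c_n|^p over all sign patterns
    n = len(coeffs)
    total = sum(abs(sum(s * c for s, c in zip(signs, coeffs))) ** p
                for signs in product((-1, 1), repeat=n))
    return (total / 2 ** n) ** (1 / p)

coeffs = [0.9, -0.3, 0.55, 0.1, -0.7]  # stand-ins for the random c_0, ..., c_4
l2 = math.sqrt(sum(c * c for c in coeffs))
for p in (0.5, 1, 1.5, 2, 3, 6):
    print(f"p={p}: {A(p) * l2:.4f} <= {lp_norm(coeffs, p):.4f} <= {B(p) * l2:.4f}")
```

Running this shows all three numbers nearly coinciding for $p$ near $2$, which is exactly the tight squeeze in the plot.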