To generate a random number $X$ uniformly distributed on the interval $[0,1]$, one can keep tossing a fair coin, record the outcomes as an infinite sequence $d_1, d_2, d_3, \dots$ of 0s and 1s, and let $X = \sum_{k=1}^{\infty} 2^{-k} d_k$. Here is a histogram of samples from the uniform distribution… nothing to see here, except maybe an incidental interference pattern.
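The coin-toss construction can be sketched numerically; here is a minimal version (my addition, in Python rather than the post's Scilab), truncating the binary expansion after 53 tosses:

```python
# A sketch of the coin-toss construction: truncate
# X = sum_{k>=1} d_k / 2^k after 53 fair coin tosses.
import numpy as np

rng = np.random.default_rng(0)

def coin_toss_uniform(n_samples, n_bits=53):
    bits = rng.integers(0, 2, size=(n_samples, n_bits))  # fair coin tosses d_k
    weights = 0.5 ** np.arange(1, n_bits + 1)            # the factors 2^{-k}
    return bits @ weights                                # sum_k d_k / 2^k

x = coin_toss_uniform(100_000)
counts, _ = np.histogram(x, bins=10, range=(0.0, 1.0))
print(counts)   # roughly equal counts per bucket: the density is flat
```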

Let’s note that $X = \frac{1}{2}(d_1 + X')$, where $X' = \sum_{k=1}^{\infty} 2^{-k} d_{k+1}$ has the same distribution as $X$ itself, and is independent of $d_1$. This has an implication for the (constant) probability density function $p$ of $X$:

$$p(x) = p(2x) + p(2x-1)$$

because $2p(2x)$ is the p.d.f. of $\frac{1}{2}X'$ and $\frac{1}{2}(\delta_0 + \delta_{1/2})$ is the p.d.f. of $\frac{1}{2}d_1$. Simply put, $p$ is equal to the convolution of the rescaled function $2p(2x)$ with the discrete measure $\frac{1}{2}(\delta_0 + \delta_{1/2})$.
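As a quick numerical sanity check (my addition, not in the post), the constant density $p \equiv 1$ on $[0,1)$ does satisfy the identity $p(x) = p(2x) + p(2x-1)$: for $x < 1/2$ only the first term is nonzero, and for $x \ge 1/2$ only the second.

```python
# Check that the uniform density satisfies p(x) = p(2x) + p(2x - 1).
import numpy as np

def p(x):
    return np.where((x >= 0) & (x < 1), 1.0, 0.0)  # uniform density on [0, 1)

xs = np.linspace(0.01, 0.99, 99)
lhs = p(xs)
rhs = p(2 * xs) + p(2 * xs - 1)
print(np.max(np.abs(lhs - rhs)))   # 0.0: the identity holds pointwise
```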

Let’s iterate the above construction by letting each $d_k$ be uniformly distributed on $[0,1]$ instead of being constrained to the endpoints. This is like tossing a “continuous fair coin”. Here is a histogram of samples of $X = \sum_{k=1}^{\infty} 2^{-k} d_k$; predictably, with more averaging the numbers gravitate toward the middle.

This is not a normal distribution; the top is too flat. The plot was made with this Scilab code, which puts n samples into b buckets:

```
n = 1e6          // number of samples
b = 200          // number of histogram buckets
z = zeros(1,n)
for i = 1:10     // z accumulates sum of d_i / 2^i with uniform d_i
    z = z + rand(1,n)/2^i
end
c = histplot(b,z)
```

If this plot is too jagged, look at the cumulative distribution function instead:

It took just one more line of code: `plot(linspace(0,1,b),cumsum(c)/sum(c))`

Compare the two plots: the c.d.f. looks very similar to the left half of the p.d.f. It turns out they are **identical** up to scaling.

Let’s see what is going on here. As before, $X = \frac{1}{2}(d_1 + X')$, where $X'$ has the same distribution as $X$ itself, and the summands are independent. But now that $d_1$ is uniform, the implication for the p.d.f. of $X$ is different:

$$p(x) = 2\int_{2x-1}^{2x} p(t)\,dt$$

This is a direct relation between $p$ and its antiderivative. Incidentally, it shows that $p$ is infinitely differentiable, because the right-hand side always has one more derivative than the left-hand side.
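The relation $p(x) = 2\int_{2x-1}^{2x} p(t)\,dt$ can also be treated as a fixed-point map and iterated on a grid. The sketch below (my own, in Python) starts from the uniform density; the iterates converge to the smooth limiting density, whose peak value is $p(1/2) = 2$:

```python
# Fixed-point iteration p_{n+1}(x) = 2 * integral_{2x-1}^{2x} p_n(t) dt,
# discretized on a uniform grid over [0, 1] with the trapezoid rule.
import numpy as np

m = 2001
x = np.linspace(0.0, 1.0, m)
dx = x[1] - x[0]
p = np.ones(m)                       # p_0: the uniform density

def step(p):
    # P(t) = integral_0^t p; clip the limits so P = 0 below 0, P = 1 above 1
    P = np.concatenate([[0.0], np.cumsum((p[1:] + p[:-1]) / 2) * dx])
    lo = np.interp(np.clip(2 * x - 1, 0.0, 1.0), x, P)
    hi = np.interp(np.clip(2 * x, 0.0, 1.0), x, P)
    return 2 * (hi - lo)

for _ in range(30):
    p = step(p)

mass = np.sum((p[1:] + p[:-1]) / 2) * dx
print(p[m // 2], mass)   # peak value tends to 2, total mass stays 1
```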

To state the self-similarity property of $X$ in the cleanest way possible, one introduces the cumulative distribution function $F$ (the Fabius function) and extends it beyond $[0,1]$ by alternating even and odd reflections across the right endpoint. The resulting function satisfies the delay-differential equation $F'(x) = 2F(2x)$: the derivative is a rescaled copy of the function itself.
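A Monte Carlo check of $F'(x) = 2F(2x)$ is easy to run (my addition); restricting to $x \in [0, 1/2]$ keeps $2x$ inside $[0,1]$, so no extension of $F$ is needed:

```python
# Estimate F'(x0) by a small histogram window and F(2 x0) by the
# empirical c.d.f., then compare the two sides of F'(x) = 2 F(2x).
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
z = sum(rng.random(n) / 2**k for k in range(1, 31))  # samples of X

h = 0.01
results = []
for x0 in [0.125, 0.25, 0.375, 0.5]:
    deriv = np.mean((z > x0 - h) & (z < x0 + h)) / (2 * h)  # F'(x0): density
    rhs = 2 * np.mean(z < 2 * x0)                           # 2 F(2 x0)
    results.append((deriv, rhs))
    print(f"x = {x0}: F'(x) ~ {deriv:.3f}, 2F(2x) ~ {rhs:.3f}")
```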

Since $F$ vanishes at the even integers, it follows that at every dyadic rational all but finitely many derivatives of $F$ are zero. The Taylor expansion at such points is a polynomial, while $F$ itself is not a polynomial on any interval. Since the dyadic rationals are dense, $F$ is **nowhere analytic** despite being everywhere $C^\infty$.

This was, in fact, the motivation for J. Fabius to introduce this construction in the 1966 paper *Probabilistic Example of a Nowhere Analytic $C^\infty$-Function*.

Looks very nice! I guess you’ll find

this https://people.math.osu.edu/edgar.2/selfdiff/

this http://www.madore.org/~david/weblog/d.2014-09-15.2223.html#d.2014-09-15.2223 [in French]

and this http://atomic-functions.ru/en/main/

interesting.