To avoid the complications from the preceding post, let’s assume that $X$ is a uniformly convex Banach space: such a space is automatically reflexive, and therefore any closed subspace $M\subset X$ has a well-defined nearest point projection $P\colon X\to M$.
Recalling that in a Hilbert space $P$ is a linear operator (and a self-adjoint one at that), we might ask if our $P$ is linear. Indeed, the invariance of distance under translations shows that $P(x+m)=P(x)+m$ for all $x\in X$ and $m\in M$. Consequently, all fibers $P^{-1}(m)$ are translates of one another. The map $P$ is also homogeneous: $P(\lambda x)=\lambda P(x)$ for all scalars $\lambda$, which follows from the homogeneity of the norm. In particular, $P^{-1}(0)$ is a two-sided cone: it’s closed under multiplication by scalars.
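Both identities follow from comparing distances and invoking the uniqueness of the nearest point; written out (in the notation $P$, $M$, $x$ for the projection, the subspace, and a point, as above), the argument is:

```latex
\[
\|(x+m)-(Px+m)\| = \|x-Px\| \le \|x-(y-m)\| = \|(x+m)-y\|
\quad\text{for all } y\in M,
\]
and $Px+m\in M$, so uniqueness gives $P(x+m)=Px+m$. Likewise, for a scalar
$\lambda\ne 0$,
\[
\|\lambda x-\lambda Px\| = |\lambda|\,\|x-Px\|
\le |\lambda|\,\|x-\lambda^{-1}y\| = \|\lambda x-y\|
\quad\text{for all } y\in M,
\]
so $P(\lambda x)=\lambda Px$ (trivially true for $\lambda=0$).
```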
In the special case $\dim X=2$ we conclude that $N=P^{-1}(0)$ is a line: it is a two-sided cone whose translates partition the plane. The direct sum decomposition $X=M\oplus N$ then identifies $P$ as a (possibly skewed) linear projection.
Well, one of the things that the geometry of Banach spaces teaches us is that 2-dimensional examples are often too simple to show what is really going on, while a 3-dimensional example may suffice. For instance, the $\ell_1$ and $\ell_\infty$ norms define isometric spaces in 2 dimensions, but not in 3 or more.
So, let’s take $X$ to be 3-dimensional with the norm $\|x\|=\left(|x_1|^p+|x_2|^p+|x_3|^p\right)^{1/p}$, where $1<p<\infty$. Let $M=\{(t,t,t)\colon t\in\mathbb{R}\}$, so that the codimension of $M$ is 2. What is the set $P^{-1}(0)$? We know it is a ruled surface: with each point it contains the line through that point and the origin. More precisely, $x\in P^{-1}(0)$ exactly when the minimum of $t\mapsto \|x-(t,t,t)\|$ is attained at $t=0$. (The minimum point is unique, since the function $t\mapsto \|x-(t,t,t)\|^p$ is strictly convex.) Differentiation reveals that
$$\operatorname{sign}(x_1)\,|x_1|^{p-1}+\operatorname{sign}(x_2)\,|x_2|^{p-1}+\operatorname{sign}(x_3)\,|x_3|^{p-1}=0,$$
which is a plane only when $p=2$. Here is this surface for $p=4$, when the equation simplifies to $x_1^3+x_2^3+x_3^3=0$:
The entire 3-dimensional space is foliated by the translates of this surface in the direction of the vector $(1,1,1)$.
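As a numerical sanity check, here is a sketch assuming the concrete setup $p=4$ with $M$ the line of the vector $(1,1,1)$; the helper `argmin_t` is mine, not a library routine. It finds the nearest point on $M$ by one-dimensional minimization and confirms both that points on the cubic surface project to the origin and that their translates along $(1,1,1)$ project to the corresponding translate.

```python
from math import copysign

def argmin_t(x, p=4.0, iters=100):
    """Parameter t of the point t*(1,1,1) on M nearest to x in the l_p norm,
    found by bisection on the increasing derivative of the strictly convex
    function f(t) = sum |x_i - t|^p."""
    lo, hi = min(x) - 1.0, max(x) + 1.0
    for _ in range(iters):
        mid = (lo + hi) / 2
        # f'(t) = -p * sum sign(x_i - t) |x_i - t|^(p-1)
        slope = -p * sum(copysign(abs(xi - mid) ** (p - 1), xi - mid) for xi in x)
        if slope < 0:
            lo = mid   # minimizer lies to the right
        else:
            hi = mid
    return (lo + hi) / 2

for x, y in [(1.0, 2.0), (-0.5, 1.5), (2.0, -3.0)]:
    # point on the surface x^3 + y^3 + z^3 = 0, i.e. z = -(x^3 + y^3)^(1/3)
    z = -copysign(abs(x**3 + y**3) ** (1 / 3), x**3 + y**3)
    assert abs(argmin_t((x, y, z))) < 1e-6            # projects to the origin
    shifted = (x + 0.3, y + 0.3, z + 0.3)             # translate along (1,1,1)
    assert abs(argmin_t(shifted) - 0.3) < 1e-6        # projects to 0.3*(1,1,1)
```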
The nearest point projection is likely to be the first nonlinear map one encounters in functional analysis. It is not even Lipschitz in general, although in decent spaces such as $\ell_p$ for $1<p<\infty$ it is Hölder continuous (I think the optimal exponent is known, but I don’t remember it).
After a little thought, the nonlinearity of the NPP is not so surprising: minimization of distance amounts to solving an equation involving the gradient of the norm, and this gradient is nonlinear unless the norm is a quadratic function, i.e., unless we are in a Hilbert space.
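The nonlinearity is easy to exhibit numerically. Below is a sketch in the same assumed setup as above ($\ell_4$ norm on $\mathbb{R}^3$, $M$ the line of $(1,1,1)$; `argmin_t` is my own helper): it picks two points $u$, $v$ and checks that $P(u)+P(v)\ne P(u+v)$.

```python
from math import copysign

def argmin_t(x, p=4.0, iters=100):
    """Parameter t of the point t*(1,1,1) on M nearest to x in the l_p norm,
    by bisection on the increasing derivative of f(t) = sum |x_i - t|^p."""
    lo, hi = min(x) - 1.0, max(x) + 1.0
    for _ in range(iters):
        mid = (lo + hi) / 2
        slope = -p * sum(copysign(abs(xi - mid) ** (p - 1), xi - mid) for xi in x)
        lo, hi = (mid, hi) if slope < 0 else (lo, mid)
    return (lo + hi) / 2

u, v = (1.0, 0.0, 0.0), (0.0, 2.0, 0.0)
s = argmin_t(u) + argmin_t(v)                     # P(u) + P(v), as a multiple of (1,1,1)
t = argmin_t(tuple(a + b for a, b in zip(u, v)))  # P(u + v)
assert abs(s - t) > 0.1                           # additivity fails: P is nonlinear
```

For these particular points one can even check by hand that $P(u+v)$ corresponds to $t=1$ (since $(1-1)^3+(2-1)^3-1^3=0$), while $P(u)+P(v)$ corresponds to $3/(1+2^{1/3})\approx 1.33$.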