This is a homework that I should have done long ago, but unfortunately nobody assigned it to me. The previous post concerned a particular ODE system of Newtonian dynamics: force depends on position only, but is not necessarily conservative. In a formula, $\ddot{x} = F(x)$, where $x$ is a vector-valued function of time $t$. (In this post I stop drawing arrows over vectors.) It is natural to rewrite this as a 1st order system by letting $v = \dot{x}$, so that $\dot{x} = v$ and $\dot{v} = F(x)$. The new system has a Jacobian matrix of special form:
$$J = \begin{pmatrix} 0 & I \\ A & 0 \end{pmatrix}$$
where $A = DF$ consists of the partial derivatives $\partial F_i/\partial x_j$ of $F$. When the force is conservative, the matrix $A$ is symmetric, and consequently $J$ is a Hamiltonian matrix. However, in general $J$ is not Hamiltonian, and I don’t know if such block matrices have a name at all.
What’s in a name? It’s the eigenvalues we are after. The first thing to observe is: $\lambda$ is an eigenvalue of $J$ if and only if $\lambda^2$ is an eigenvalue of $A$. Indeed, suppose that $\begin{pmatrix} x \\ v \end{pmatrix}$ is an eigenvector for $\lambda$. Here $x$ and $v$ are $n$-dimensional vectors: it is natural to write vectors in such block form when dealing with block matrices. Performing block-by-block multiplication in the equation
$$\begin{pmatrix} 0 & I \\ A & 0 \end{pmatrix} \begin{pmatrix} x \\ v \end{pmatrix} = \lambda \begin{pmatrix} x \\ v \end{pmatrix}$$
we obtain $\begin{pmatrix} v \\ Ax \end{pmatrix} = \lambda \begin{pmatrix} x \\ v \end{pmatrix}$. Thus, $v = \lambda x$ and $Ax = \lambda v$; put this together to get $Ax = \lambda^2 x$. All steps can be reversed: if $\mu$ is an eigenvalue of $A$ with eigenvector $x$, then we can take $\lambda$ such that $\lambda^2 = \mu$, let $v = \lambda x$, and thus construct an eigenvector $\begin{pmatrix} x \\ v \end{pmatrix}$ for $J$.
We now understand enough to divide the situation into two cases.
Case 1. The matrix $A$ has an eigenvalue $\mu$ that does not belong to $(-\infty, 0]$.
Case 2. All eigenvalues of $A$ are in $(-\infty, 0]$ (are nonpositive real numbers).
In Case 1 the matrix $J$ has at least one eigenvalue with positive real part, namely one of the two square roots $\pm\sqrt{\mu}$ of an eigenvalue $\mu \notin (-\infty, 0]$: since $\mu$ is not a nonpositive real number, its square roots are not purely imaginary. Hence, the corresponding equilibrium is unstable.
In Case 2 all eigenvalues of $J$ have zero real part, being square roots of nonpositive real numbers, and so the equilibrium is non-hyperbolic and we cannot determine its stability by looking at $J$.
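The two cases amount to checking whether every eigenvalue of $A$ is a nonpositive real number. A minimal sketch (the function name, return labels, and tolerance are my own choices):

```python
import numpy as np

def classify_equilibrium(A, tol=1e-9):
    """Case 1 ('unstable'): A has an eigenvalue outside (-inf, 0].
    Case 2 ('inconclusive'): all eigenvalues of A are nonpositive reals,
    so the linearization J is non-hyperbolic."""
    mu = np.linalg.eigvals(A)
    # an eigenvalue is "Case 2" if it is (numerically) real and <= 0
    case2 = (np.abs(mu.imag) < tol) & (mu.real < tol)
    return "inconclusive" if case2.all() else "unstable"

print(classify_equilibrium(np.array([[0.0, 1.0], [1.0, 0.0]])))    # eigenvalues +-1 -> unstable
print(classify_equilibrium(np.array([[-1.0, 0.0], [0.0, -2.0]])))  # eigenvalues -1, -2 -> inconclusive
```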
This raises the question: how can we tell whether $A$ has only nonpositive real eigenvalues? Here is an easy necessary condition: the characteristic polynomial of $A$ must have nonnegative coefficients. Indeed, $\det(tI - A) = \prod_{i=1}^n (t - \lambda_i)$ where $\lambda_i \le 0$ for all $i$. Multiplying out the product, we do not get negative numbers anywhere. This is a practical condition to check, since it does not require us to solve the characteristic equation: we only look at its coefficients. However, it is not sufficient.
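The coefficient test takes a few lines of NumPy, since `np.poly` returns the coefficients of $\det(tI - A)$. The last example below also shows why the condition is not sufficient: $\begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix}$ has characteristic polynomial $t^2 + 1$ with nonnegative coefficients, yet its eigenvalues $\pm i$ are not nonpositive reals. (The function name and tolerance are mine.)

```python
import numpy as np

def nonneg_charpoly_coeffs(A, tol=1e-9):
    # Necessary (but not sufficient) for all eigenvalues of A to be
    # nonpositive reals: det(tI - A) = (t - l_1)...(t - l_n) with l_i <= 0
    # expands with no negative coefficients.
    return bool((np.poly(A) >= -tol).all())

print(nonneg_charpoly_coeffs(np.array([[-1.0, 5.0], [0.0, -2.0]])))  # True: eigenvalues -1, -2
print(nonneg_charpoly_coeffs(np.array([[1.0, 0.0], [0.0, -1.0]])))   # False: eigenvalue +1
print(nonneg_charpoly_coeffs(np.array([[0.0, 1.0], [-1.0, 0.0]])))   # True, yet eigenvalues are +-i
```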
There is also a sufficient condition. Since changing the sign of a matrix flips the signs of all its eigenvalues, we can just as well ask whether $-A$ has only nonnegative eigenvalues. A sufficient condition is: all minors of $-A$ (determinants of square submatrices) are nonnegative. Here is a longer explanation:
A matrix is totally positive if all of its minors are positive. This involves minors of all sizes. For example, in a 3×3 matrix we would have to check that each entry (each 1×1 minor) is positive, that each of the nine 2×2 minors is positive, and that the determinant of the whole matrix is positive. Quite a bit of work. The reward is great, however: all eigenvalues of a totally positive matrix are strictly positive. So there was (and still is) substantial interest in reducing the number of minors that need to be checked to verify that a given matrix is totally positive. Fomin’s minicourse on total positivity gives links to several resources on the subject. Unfortunately the more accessible ones, such as Ando’s 1987 survey Totally positive matrices, are behind paywalls.
A matrix is totally nonnegative if all of its minors have nonnegative determinant. All eigenvalues of a totally nonnegative matrix are nonnegative. The sources of information are the same as for totally positive matrices.
Being totally nonnegative is a sufficient condition for having nonnegative eigenvalues, but of course not necessary: $\begin{pmatrix} 2 & -1 \\ -1 & 2 \end{pmatrix}$ has two positive eigenvalues, for example.
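A brute-force check of total nonnegativity enumerates every square submatrix; a sketch in NumPy (usable only for small matrices, since the number of minors grows exponentially). The symmetric matrix $\begin{pmatrix} 2 & -1 \\ -1 & 2 \end{pmatrix}$, a test case of my own choosing, fails the check because of its negative entries, yet still has positive eigenvalues:

```python
import numpy as np
from itertools import combinations

def is_totally_nonnegative(M, tol=1e-12):
    # Check that every minor (determinant of a square submatrix) of M
    # is nonnegative, for every size k = 1, ..., min(n, m).
    n, m = M.shape
    for k in range(1, min(n, m) + 1):
        for rows in combinations(range(n), k):
            for cols in combinations(range(m), k):
                if np.linalg.det(M[np.ix_(rows, cols)]) < -tol:
                    return False
    return True

print(is_totally_nonnegative(np.array([[1.0, 1.0], [0.0, 1.0]])))  # True
M = np.array([[2.0, -1.0], [-1.0, 2.0]])
print(is_totally_nonnegative(M))  # False: the entry -1 is a negative 1x1 minor
print(np.linalg.eigvals(M))       # eigenvalues 1 and 3, both positive anyway
```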