Let’s say that a set $A$ contains an $n\times n$ singular matrix if there exists an $n\times n$ matrix with determinant zero whose entries are distinct elements of $A$. For example, the set of prime numbers does not contain any $2\times 2$ singular matrices; however, for every $n\ge 3$ it contains infinitely many $n\times n$ singular matrices.
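The $2\times 2$ claim follows from unique factorization: $ad = bc$ is impossible for four distinct primes. Here is a minimal brute-force sanity check (my own sketch, not from the post; the helper names and the bound 30 are arbitrary choices):

```python
from itertools import permutations

def primes_up_to(n):
    """Simple sieve of Eratosthenes returning all primes <= n."""
    sieve = bytearray([1]) * (n + 1)
    sieve[0:2] = b"\x00\x00"
    for p in range(2, int(n ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p :: p] = bytearray(len(sieve[p * p :: p]))
    return [p for p in range(n + 1) if sieve[p]]

# Try every 2x2 matrix [[a, b], [c, d]] with distinct prime entries:
# det = ad - bc = 0 would force ad = bc, which unique factorization
# forbids when a, b, c, d are distinct primes.
singular = [
    (a, b, c, d)
    for a, b, c, d in permutations(primes_up_to(30), 4)
    if a * d - b * c == 0
]
print(singular)  # -> []
```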
I don’t know of an elementary proof of the latter fact. By a 1939 theorem of van der Corput, the set of primes contains infinitely many arithmetic progressions of length 3. (Much more recently, Green and Tao replaced 3 with an arbitrary $k$.) If every row of an $n\times n$ matrix begins with a 3-term arithmetic progression, then the first three columns satisfy $c_1 - 2c_2 + c_3 = 0$, so the matrix is singular.
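The column relation above can be seen in a small example (my own sketch; the particular prime progressions are arbitrary choices). Each row below is a 3-term arithmetic progression of primes, all nine entries are distinct, and the determinant vanishes:

```python
def det3(m):
    """Exact 3x3 determinant over the integers (cofactor expansion)."""
    (a, b, c), (d, e, f), (g, h, i) = m
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

# Row-wise, a - 2b + c = 0, so the columns satisfy
# col1 - 2*col2 + col3 = 0: the columns are linearly dependent.
M = [
    [5, 11, 17],  # primes with common difference 6
    [7, 19, 31],  # primes with common difference 12
    [3, 23, 43],  # primes with common difference 20
]
print(det3(M))  # -> 0
```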
In the opposite direction, one may want to see an example of an infinite set that contains no singular matrices of any size. Here is one:
Let $A = \{2^{2^k} \colon k = 0, 1, 2, \dots\}$: these numbers are 1 less than Fermat’s should-have-been-primes. Suppose that an $n\times n$ matrix $(a_{ij})$ is filled with distinct elements of $A$. Then
$$\det(a_{ij}) = \sum_{\sigma \in S_n} \operatorname{sgn}(\sigma) \prod_{i=1}^{n} a_{i\sigma(i)},$$
where each product over $i$ is a power of 2. Moreover, these powers of 2 are distinct by the uniqueness of binary representation of integers: distinct permutations use distinct sets of entries, hence distinct sets of exponents of the form $2^k$, hence distinct exponent sums. It follows that the sum over $\sigma$ is nonzero, again by the uniqueness of binary representation: the largest power of 2 cannot be cancelled.
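This argument can be watched numerically (my own sketch, assuming the set $A = \{2^{2^k}\}$ described above). Python’s exact integers handle $2^{256}$ without trouble, so we can expand a $3\times 3$ determinant term by term and confirm that the six products are distinct powers of 2:

```python
from itertools import permutations

# First nine elements of A = {2^(2^k)}; Python ints are exact.
A = [2 ** (2 ** k) for k in range(9)]
M = [A[0:3], A[3:6], A[6:9]]  # 3x3 matrix with distinct entries of A

def sign(perm):
    """Sign of a permutation given as a tuple of 0-based indices."""
    s = 1
    for i in range(len(perm)):
        for j in range(i + 1, len(perm)):
            if perm[i] > perm[j]:
                s = -s
    return s

# Leibniz expansion: each term is +/- 2^(sum of distinct exponents 2^k).
terms = [
    sign(p) * M[0][p[0]] * M[1][p[1]] * M[2][p[2]]
    for p in permutations(range(3))
]
# The six products are distinct powers of 2 (binary uniqueness), so the
# largest one cannot be cancelled and the determinant is nonzero.
assert len({abs(t) for t in terms}) == 6
print(sum(terms) != 0)  # -> True
```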
This set is very sparse, because the sequence $2^{2^k}$ grows super-exponentially. Is there such an example with exponential growth? Or even with polynomial growth?
Disclaimer: This being a weekend blog post (and not a research proposal), I did not study the literature.
Credit: Once again, to Colin Carroll for “Prime Matrices I, II”.