This post is a review of our initial results and bits of 170 years of history.
The title of our new paper is
Elliptic Kac–Sylvester Matrix from Difference Lamé Equation
It was published in the mathematical physics journal Annales Henri Poincaré. The article can be found on the publisher’s website at https://link.springer.com/article/10.1007/s00023-021-01063-y. The paper is freely available at https://rdcu.be/clsVe.
Sylvester’s tridiagonal determinant
This story starts in 1854 with James Joseph Sylvester's calculation of some tridiagonal determinants:

\[ \det\begin{pmatrix} x & 1 & & & \\ n & x & 2 & & \\ & n-1 & x & \ddots & \\ & & \ddots & \ddots & n \\ & & & 1 & x \end{pmatrix} = \prod_{k=0}^{n}\bigl(x-(n-2k)\bigr). \]
Notice the similarity with the matrix in eq. (1): the numbers go up and down above and below the diagonal. The original Kac-Sylvester matrix reads as follows:

\[ M_n = \begin{pmatrix} 0 & n & & & \\ 1 & 0 & n-1 & & \\ & 2 & 0 & \ddots & \\ & & \ddots & \ddots & 1 \\ & & & n & 0 \end{pmatrix} \tag{2} \]

It's an $(n+1)\times(n+1)$ matrix with the numbers $1,2,\dots,n$ below the diagonal, $n,n-1,\dots,1$ above it and $0$ everywhere else.

What Sylvester's 1854 paper demonstrates is that the eigenvalues of $M_n$ in eq. (2) form an arithmetic progression: $-n,\,-n+2,\,\dots,\,n-2,\,n$.
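Sylvester's spectrum is easy to verify numerically. Here is a quick sketch (assuming NumPy; the size $n=10$ is just an illustrative choice):

```python
import numpy as np

n = 10
# Kac-Sylvester matrix: 1, 2, ..., n below the diagonal, n, ..., 2, 1 above it
M = np.diag(np.arange(1, n + 1), k=-1) + np.diag(np.arange(n, 0, -1), k=1)

# The spectrum is the arithmetic progression -n, -n+2, ..., n-2, n
eig = np.sort(np.linalg.eigvals(M).real)
print(eig)
```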
While we are at it, some fun facts about Sylvester:

1. Sylvester was born James Joseph. The name ‘Sylvester’ was taken up when his brother emigrated to the US, where at the time one could only gain residence if one's name had at least three parts…
2. Sylvester came up with names for many mathematical concepts, including the term matrix, which means womb in Latin. So a matrix is a thing that's pregnant with numbers…
3. The previous two fun facts imply a third one, namely that Sylvester invented two out of the three words in the expression "Kac-Sylvester matrix".
By the way, we don't know exactly why Sylvester was interested in this particular matrix, but it's clear that studying these types of tridiagonal determinants, called continuants, was a thing in the mid-19th century due to their relation to continued fractions.
The eigenvalues of the Kac-Sylvester matrix were computed by Sylvester in 1854, whilst the eigenvectors were only found about a century later, in 1947, by Mark Kac. Before explaining Kac's result, let me tell you about some interesting episodes that happened in the meantime.
The next time the Kac-Sylvester matrix showed up, it helped save Boltzmann from sceptics of his kinetic theory. In a nutshell, people thought that Boltzmann's statistical mechanics contradicted the 2nd law of thermodynamics. And they were right…
For example, heat flowing from hot to cold isn't an exact law in Boltzmann's theory, but only true statistically. The purely mechanical motion of microscopic particles can result in lower-entropy states. Here is a very nice illustration by Matt Henderson, the best math animator Twitter has ever seen:
Entropy decrease (by Poincaré recurrence) was Zermelo's objection. Boltzmann agreed that this could take place, but correctly thought that it'd happen on timescales so large that we never experience it, hence the 2nd law seems exact.
The Ehrenfests and their dogs with fleas
To help make Boltzmann’s point, Tatiana and Paul Ehrenfest, in their 1907 paper, proposed the dog-flea model which is a simple model of heat exchange. Imagine two dogs standing close to each other with 100 fleas being shared between them.
If the fleas jump from one dog to the other at random you’d expect the fleas to spread roughly evenly (50:50) after a while. The probability P(n+1|n) of dog A going from having n fleas to n+1 fleas is (100-n)/100, while P(n-1|n) = n/100.
Arrange these probabilities in a (transition) matrix and voilà you have the rescaled Kac-Sylvester matrix of size 101×101. It’s rescaled, because each entry is divided by 100 to get probabilities.
And now the exciting part. The recurrence times are obtained from the equilibrium distribution, which is encoded in the eigenvector with the largest eigenvalue. This distribution turns out to be the binomial distribution, so recurrence to atypical states occurs on exponentially long timescales.
For example, the expected number of flea jumps required to return to the initial 90:10 state is $2^{100}/\binom{100}{10} \approx 7\times 10^{16}$. So if flea jumps occur every second, it'll take about 2 billion years(!) to return to the initial state.
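For the curious, the arithmetic behind that figure can be sketched as follows (using the standard Markov-chain fact that the mean recurrence time of a state is the reciprocal of its stationary probability):

```python
from math import comb

N = 100
# Stationary probability of the 90:10 split is binom(100, 10) / 2^100,
# so the mean recurrence time (in jumps) is its reciprocal
steps = 2**N / comb(N, 10)
years = steps / (60 * 60 * 24 * 365)  # assuming one jump per second
print(f"{steps:.2e} jumps = {years:.2e} years")
```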
Now imagine that instead of 100 fleas, the dogs have a macroscopic number of them, say $10^{23}$ (poor doggies), and you can quickly see just how improbable it is to return to the initial state.
Let's recap! The rescaled Kac-Sylvester matrix is the transition matrix of the Ehrenfest model (dog-flea model). The left-eigenvector with eigenvalue 1 encodes the binomial distribution, that is we have

\[ \pi P = \pi, \qquad \pi_n = \binom{100}{n}\frac{1}{2^{100}}. \]
So what about the other eigenvectors? We’ll get to them shortly, but first Schrödinger.
Schrödinger’s failed attempt
Schrödinger's 1926 papers on "Wave Mechanics" are legendary. In them, he solves various quantum models using his new equation. When he turns to the Stark effect, he encounters the symmetric Kac-Sylvester matrix, but struggles with it:

\[ S_n = \begin{pmatrix} 0 & \sqrt{1\cdot n} & & & \\ \sqrt{1\cdot n} & 0 & \sqrt{2(n-1)} & & \\ & \sqrt{2(n-1)} & 0 & \ddots & \\ & & \ddots & \ddots & \sqrt{n\cdot 1} \\ & & & \sqrt{n\cdot 1} & 0 \end{pmatrix} \]

The above matrix is the symmetric version of the Kac-Sylvester matrix, which can be obtained from the original one by a similarity transformation. Therefore they have the same eigenvalues.
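The similarity claim is easy to check numerically. A sketch (NumPy assumed) of the symmetric version, whose off-diagonal entries are $\sqrt{k(n+1-k)}$ for $k=1,\dots,n$:

```python
import numpy as np

n = 10
# Symmetric Kac-Sylvester matrix: off-diagonal entries sqrt(k(n+1-k)), k = 1..n
off = np.sqrt(np.arange(1, n + 1) * np.arange(n, 0, -1))
S = np.diag(off, k=1) + np.diag(off, k=-1)

# Same spectrum as the non-symmetric original: -n, -n+2, ..., n
eig = np.sort(np.linalg.eigvalsh(S))
print(eig)
```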
So Schrödinger could guess the eigenvalues of the Kac-Sylvester matrix, but couldn’t find a proof.
Mark Kac's result
Mark Kac is most famous for his 1966 popular article Can One Hear the Shape of a Drum?, for which he received the Chauvenet Prize, the highest award for mathematical expository writing. It's less well known that this was the 2nd time Kac got this prize.

Kac's first Chauvenet Prize was awarded for his 1947 paper Random Walk and the Theory of Brownian Motion, in which he described, among other things, a particle's random walk along the integers between $-n$ and $n$ with a spring fixed at the origin pulling on it.
Kac found the eigenvectors of the Kac-Sylvester matrix. These are given by the Krawtchouk polynomials

\[ K_k(x) = {}_2F_1(-k,\,-x;\,-n;\,2) = \sum_{j=0}^{k} \frac{(-k)_j\,(-x)_j}{(-n)_j}\,\frac{2^j}{j!}, \]

which are a family of discrete orthogonal polynomials (OPs) with the binomial distribution as weight function, i.e. we have

\[ \sum_{x=0}^{n} \binom{n}{x}\frac{1}{2^n}\, K_j(x)\, K_k(x) = \frac{\delta_{jk}}{\binom{n}{k}}. \]
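One can probe this orthogonality without writing down any Krawtchouk formulas: the right-eigenvectors of the Kac-Sylvester matrix, rescaled so that their first entries equal 1, should be orthogonal with respect to the binomial weight. A numerical sanity check, not a proof (NumPy assumed; $n=8$ is illustrative):

```python
import numpy as np
from math import comb

n = 8
# Kac-Sylvester matrix and the binomial weight
M = np.diag(np.arange(1, n + 1), k=-1) + np.diag(np.arange(n, 0, -1), k=1)
w = np.array([comb(n, x) for x in range(n + 1)], dtype=float) / 2**n

vals, vecs = np.linalg.eig(M)
V = vecs[:, np.argsort(vals.real)].real
V = V / V[0]  # normalize each eigenvector so its x = 0 entry is 1

# Gram matrix under the binomial weight should be diagonal
G = V.T @ np.diag(w) @ V
print(np.allclose(G, np.diag(np.diag(G)), atol=1e-8))
```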
The connection to OPs is not a coincidence…
Quick aside: Mark Kac wrote an excellent autobiography titled "Enigmas of Chance". It's well worth a read. Here is a bit about the Kac-Sylvester problem.
Tridiagonal matrices and orthogonal polynomials
The link between tridiagonal matrices and orthogonal polynomials is well-known, see here.
The crux is that any family of orthogonal polynomials satisfies a three-term recurrence relation of the form

\[ x\,p_k(x) = a_k\,p_{k+1}(x) + b_k\,p_k(x) + c_k\,p_{k-1}(x), \]

which involves 3 consecutive OPs and can thus be encoded in a tridiagonal matrix.
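As a concrete, standard illustration (not tied to our paper): for the Chebyshev polynomials of the first kind, the recurrence $x T_k = \tfrac12 T_{k+1} + \tfrac12 T_{k-1}$ (with $x T_0 = T_1$) yields a tridiagonal "Jacobi matrix" whose eigenvalues are the zeros of $T_n$.

```python
import numpy as np

n = 6
# Symmetrized Jacobi matrix of the Chebyshev-T recurrence:
# first off-diagonal entry sqrt(1/2), the rest 1/2
off = np.array([np.sqrt(0.5)] + [0.5] * (n - 2))
J = np.diag(off, k=1) + np.diag(off, k=-1)

# Its eigenvalues are the Chebyshev nodes cos((2j-1)*pi/(2n)), the zeros of T_6
nodes = np.sort(np.linalg.eigvalsh(J))
expected = np.sort(np.cos((2 * np.arange(1, n + 1) - 1) * np.pi / (2 * n)))
print(np.allclose(nodes, expected))  # True
```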
This talk by Askey has the details and is full of nice historical facts.
And now, some fun facts about orthogonal polynomials:

1. The Hermite polynomials were first defined by Laplace in 1810, when Hermite was -12 years old.
2. In 1940 Wigner introduced the 6j-symbols in quantum mechanics, but he didn't realize they could be viewed as (discrete) orthogonal polynomials until decades later, when Askey told him.
Lamé’s equation: differential vs difference
Lamé's differential equation appears when solving the Laplace equation by separation of variables in ellipsoidal coordinates. Here Lamé's equation is written in terms of Weierstrass' elliptic function $\wp$ (A, B are constants):

\[ f''(z) = \bigl(A + B\,\wp(z)\bigr)\, f(z). \]
And this is the difference Lamé equation:
The ingredients are Jacobi's elliptic theta function, a complex parameter (coupling constant), a shift step size (Compton wavelength), the unknown eigenvalue (energy level) and the unknown eigenfunction (wave function).
Choosing the (real) period of the theta function and the domain of $x$ carefully turns the difference Lamé equation into the eigenvalue problem of a tridiagonal matrix. Can you guess what this mystery matrix looks like? Bingo! It's the elliptic Kac-Sylvester matrix we've seen in eq. (1).
Our main result
We expressed the eigenvectors of the elliptic Kac-Sylvester matrix as discrete orthogonal polynomials. These new OPs are elliptic generalizations of the Krawtchouk (and Rogers) polynomials. Thus we extended many of the old results mentioned in this post.
Finally, let me give you a hard(?) open problem: find the eigenvalues of the elliptic Kac-Sylvester matrix. If you could explicitly express the eigenvalues in terms of the theta function for arbitrary matrix sizes, you would make a mathematical discovery!
This was our elliptic Kac-Sylvester paper in a nutshell. If you are interested in seeing the details, you can read the full paper for free at rdcu.be/clsVe .
I might do more posts in the near future since we have some new results that are even more exciting than the one I’ve just described.