Keevash on triangle decompositions

Today Peter Keevash finished four and a half hours of lectures on his latest improvement of his result on the existence of designs.

I have heard him talk on this before, but in a one-hour talk he only had time to outline the strategy of the proof, without going into much detail. That was not the case here, and he gave a riveting account of the many complications that had to be faced on the way to the theorem. Also, he has a much more general theorem now, but he chose to concentrate on one special case, which actually bears the same relationship to the new general theorem as Kirkman’s construction of “Steiner triple systems” does to Keevash’s earlier theorem on the existence of t-designs for all t. If that sounds as though I am belittling what he did, that is absolutely not the case, as will be clear to everyone who was there; rather, the special case covered many of the complexities of the general proof but kept the exposition within reasonable bounds.

Here I propose to give his four-point outline of the proof, and then describe some of the complexities of the four steps. But first I will say how the new theorem is much more general than the old one.

A t-(v,k,λ) design is a collection of k-sets of a v-set which cover every t-set exactly λ times; in other words, a partition of the hyperedges of the λ-fold complete t-uniform hypergraph on v points into copies of the complete t-uniform hypergraph on k points. In particular, a Steiner triple system is a partition of the edge set of the complete graph into triangles. The new results extend this from partitioning the edges of a complete uniform hypergraph to a much wider class of hypergraphs. The notation would have become nearly uncontrollable if he had presented the proof in that generality; so instead he showed us how to partition the edges of a suitable graph into triangles.
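
To make the definition concrete, here is a little check (my own illustration in Python, not anything from the lectures) that the Fano plane is a 2-(7,3,1) design, or equivalently a Steiner triple system on 7 points: a partition of the edges of K_7 into triangles.

```python
from itertools import combinations

# The Fano plane: seven points, seven blocks (lines), each a 3-set.
fano = [
    {0, 1, 2}, {0, 3, 4}, {0, 5, 6},
    {1, 3, 5}, {1, 4, 6}, {2, 3, 6}, {2, 4, 5},
]

# Defining property of a 2-(7,3,1) design: every pair of points
# lies in exactly one block.
for pair in combinations(range(7), 2):
    assert sum(set(pair) <= block for block in fano) == 1

print("every pair of points is covered exactly once")
```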

The conditions on the graph are:

  • It should be “tridivisible”, that is, the number of edges should be divisible by 3, and all the vertex degrees should be even (these are the familiar necessary conditions).
  • The edge density should be at least some constant δ.
  • The graph should satisfy a strengthening of a property that the random graph has, which he called “typical”. Typicality takes two parameters c (a small positive real number) and h (a small positive integer), and asserts that the number of common neighbours of any set of at most h vertices should be between (1−c) and (1+c) times the expected number in a random graph with the same edge density. Peter showed us a proof with h = 16, but at the end indicated why h = 2 will suffice. (This is stronger than being random or pseudorandom: in those cases, only most sets of vertices are required to satisfy the condition, and some larger deviations are permitted.) Tridivisibility and typicality are both spelled out in the little sketch following this list.
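
Here is a minimal sketch of the two combinatorial conditions, again my own illustration: a brute-force test of tridivisibility and (c,h)-typicality for a small graph given by adjacency sets. (Even K_7 needs a little slack: its edge density is 1 but two vertices have only five common neighbours, so c = 0.3 is used below.)

```python
from itertools import combinations

def is_tridivisible(n, adj):
    # Familiar necessary conditions: the number of edges is divisible
    # by 3, and every vertex degree is even.
    degrees = [len(adj[v]) for v in range(n)]
    return sum(degrees) // 2 % 3 == 0 and all(d % 2 == 0 for d in degrees)

def is_typical(n, adj, c, h):
    # (c, h)-typicality: every set S of at most h vertices has between
    # (1 - c) * d**|S| * n and (1 + c) * d**|S| * n common neighbours,
    # where d is the edge density.
    d = sum(len(adj[v]) for v in range(n)) / (n * (n - 1))
    for s in range(1, h + 1):
        for S in combinations(range(n), s):
            common = len(set.intersection(*(adj[v] for v in S)))
            if not (1 - c) * d**s * n <= common <= (1 + c) * d**s * n:
                return False
    return True

n = 7
adj = [set(range(n)) - {v} for v in range(n)]   # K_7 as adjacency sets
print(is_tridivisible(n, adj), is_typical(n, adj, c=0.3, h=2))  # True True
```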

The four steps in the proof are, briefly, the following:

  1. First, set aside a collection of triangles, called the template, for use at the end.
  2. Next, use the Rödl nibble argument to cover almost all of the remaining edges by triangles. The ones left over form the leave.
  3. Put the edges of the leave into triangles using some edges from the template triangles (which have then been used twice). The edges used for this form the spill.
  4. Finally, fix the twice-covered edges by finding two sets A and B of triangles, where the spill together with the edges in A forms the same set as the edges in B, and the triangles of B are template triangles; then swap the spill and A for B.

(Added later: here is a picture (by Sebi Cioabă) illustrating the steps.)

Now just a brief note on the steps.

Step 1: randomly embed the set of n vertices of the graph into an elementary abelian 2-group of size a little larger than n (precisely, between two and four times n). Then the template consists of all the triangles xyz of the graph for which x+y+z = 0 in the abelian group.

Using concentration results (specifically, Azuma’s inequality), one shows that the graph formed by the edges of the template triangles is itself dense and typical, and the vertex degrees are not too large.
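
Here is a toy version of the template construction, on the understanding that the real proof keeps only the zero-sum triples that are actually triangles of G, and does much more besides; the function name and parameters are my own. The group F_2^a is realised as a-bit integers under XOR.

```python
import random

def build_template(n, seed=0):
    # Embed vertices 0, ..., n-1 injectively at random into F_2^a,
    # realised as a-bit integers under XOR, where 2n < 2**a <= 4n.
    random.seed(seed)
    a = n.bit_length() + 1
    phi = dict(zip(range(n), random.sample(range(2 ** a), n)))
    # Template triples: x + y + z = 0 in the group, i.e. the XOR is zero.
    # (In the proof, only those that are triangles of G are kept.)
    return [(x, y, z)
            for x in range(n) for y in range(x + 1, n) for z in range(y + 1, n)
            if phi[x] ^ phi[y] ^ phi[z] == 0]

print(build_template(50)[:5])
```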

Step 2: The original Rödl nibble doesn’t quite work here; the version needed, due to Frankl and Rödl and to Pippenger and Spencer, gives a near-perfect matching in a hypergraph under suitable conditions. The hypergraph used has as vertices the edges of G not in template triangles, and as edges the sets of three edges of G forming a triangle. A near-perfect matching is then a set of edge-disjoint triangles covering most of these edges, as required.
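
As a crude stand-in for the nibble (emphatically not the Frankl–Rödl or Pippenger–Spencer argument, whose analysis is the whole point), a single random greedy pass already covers most edges of a dense graph, which may help to fix the ideas:

```python
from itertools import combinations
import random

def greedy_triangle_packing(n, adj, seed=0):
    # Vertices of the auxiliary hypergraph = edges of G;
    # hyperedges = triangles of G; a matching = edge-disjoint triangles.
    triangles = [(x, y, z) for x, y, z in combinations(range(n), 3)
                 if y in adj[x] and z in adj[x] and z in adj[y]]
    random.seed(seed)
    random.shuffle(triangles)
    used, packing = set(), []
    for x, y, z in triangles:
        es = {(x, y), (x, z), (y, z)}
        if not es & used:    # take the triangle if all three edges are free
            used |= es
            packing.append((x, y, z))
    return packing, used

n = 30
adj = [set(range(n)) - {v} for v in range(n)]    # K_30 as a toy example
packing, used = greedy_triangle_packing(n, adj)
print(len(used), "of", n * (n - 1) // 2, "edges covered")
```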

Step 3: The edges of the leave (not covered in Step 2) are completed to triangles using edges from the template, ensuring that the pairs of edges used don’t overlap. This is done by a random greedy algorithm. If it were not for the disjointness condition, the choices would be independent, and the algorithm easy to analyse; the correct version turns out to be a fairly small perturbation of this. In this way, it is shown that the algorithm succeeds, and that no vertices of high degree are created.

Since “random greedy” is very important in Step 4, Peter covered this in considerable detail.
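
Ignoring the delicate analysis, the basic loop of Step 3 might look as follows; this is my own toy version, with the failure case (no available completion) left unhandled, since typicality is exactly what guarantees the candidate set stays large.

```python
import random

def complete_leave(leave, template_adj, seed=0):
    # Complete each leave edge uv to a triangle uvw, using two template
    # edges uw, vw that no earlier choice has already taken; the template
    # edges used form the spill (they are now covered twice).
    random.seed(seed)
    spill, completion = set(), []
    for u, v in leave:
        candidates = [w for w in template_adj[u] & template_adj[v]
                      if not {frozenset({u, w}), frozenset({v, w})} & spill]
        w = random.choice(candidates)   # typicality keeps this list long
        spill |= {frozenset({u, w}), frozenset({v, w})}
        completion.append((u, v, w))
    return completion, spill

# Toy usage: complete two leave edges, pretending K_6 is the template graph.
n = 6
template_adj = [set(range(n)) - {v} for v in range(n)]
print(complete_leave([(0, 1), (2, 3)], template_adj))
```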

Step 4: The final fix. The basic idea here is that of an integral design (with multiplicities −1, 0, and 1), as in the work of Graver and Jurkat and of Wilson. But the context is a little different, since the spill edges must be dealt with, and so the work has to be done again from scratch.

Using typicality, it is possible to show that the “bad” edges can be embedded into subgraphs where a “switch” replaces the triangles by good ones. However, we have to be careful to get, not just triangles made of template edges, but actual template triangles! This is where the algebraic structure comes in.

Typicality, worked to the limit, shows that we can choose the configurations so that a new triangle xyz is embedded into an octahedron whose other vertices are x+y, y+z and z+x (in the abelian group). Now the eight faces fall into two sets of four, the first containing xyz, and the second consisting of triangles like xy(x+y) (sums again in the abelian group), which are indeed template triangles. This is done by taking more care in choosing the configurations mentioned in the preceding paragraph, so that each of these triangles sits inside such an octahedron. The configurations become sufficiently complicated that apparently the typicality condition is required for sets of as many as 16 vertices. But, as Peter explained, typicality for sets of two vertices implies that the condition holds for almost all sets of 3, …, 16 vertices, and this is enough for the random greedy algorithm to work with.
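
The octahedron calculation is easy to verify directly. With XOR playing the role of addition in the elementary abelian 2-group, the eight faces split exactly as described: the four faces using an odd number of the “sum” vertices x+y, y+z, z+x have zero sum (template triangles), and the other four, including xyz itself, all sum to x+y+z.

```python
from itertools import product

# The octahedron has vertex parts {x, y+z}, {y, z+x}, {z, x+y}; each of
# its eight triangular faces takes one vertex from each part.
x, y, z = 0b001, 0b010, 0b111      # any three suitable group elements
parts = [(x, y ^ z), (y, z ^ x), (z, x ^ y)]
sums = sorted(a ^ b ^ c for a, b, c in product(*parts))
print(sums)   # four zeros (template faces) and four copies of x+y+z
```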

A tour de force!
