This project focuses on two areas where combinatorics and other parts of mathematics meet: correlation inequalities, and applications of entropy to combinatorial problems. Correlation inequalities deal with positive and negative reinforcement among events in a probability space (e.g. how does knowing event A has occurred affect the probability of event B?). Questions of this type are fundamental for probability and statistical mechanics, and also play important roles in, e.g., statistics, computer science and discrete mathematics. Nonetheless, despite a great deal of activity over the last forty-five years, some basic insights seem to be lacking, and even some of the simplest questions remain open. The project contains a mix of old and new problems, some quite specific, others more general, but all intended to push in the direction of some of those missing insights.

Shannon entropy, introduced in 1948 for coding-theoretic purposes, has, mostly in the last ten or fifteen years, turned out to be a valuable tool for certain kinds of combinatorial extremal problems. (For example: how many independent sets---i.e. sets of vertices spanning no edges---can one have in a regular graph with given degree and number of vertices?) The project considers a number of problems of this type, each thought to be both of independent interest and likely to force further development of entropy techniques.
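To make the independent-set question concrete, here is a small brute-force illustration (the graph and helper function are illustrative choices, not part of the project). It counts the independent sets of the 3-regular cube graph Q3 and compares the count with the bound (2^(d+1) - 1)^(n/(2d)) for n-vertex d-regular bipartite graphs, which Kahn proved using entropy:

```python
from itertools import combinations

def count_independent_sets(n, edges):
    """Count subsets of {0, ..., n-1} (including the empty set)
    that contain no edge of the given graph."""
    total = 0
    for r in range(n + 1):
        for s in combinations(range(n), r):
            chosen = set(s)
            if all(u not in chosen or v not in chosen for u, v in edges):
                total += 1
    return total

# The 3-regular cube graph Q3: vertices are 3-bit strings,
# with edges joining strings that differ in exactly one bit.
cube = [(u, v) for u in range(8) for v in range(u + 1, 8)
        if bin(u ^ v).count("1") == 1]

i_q3 = count_independent_sets(8, cube)   # exhaustive count for Q3
bound = (2 ** 4 - 1) ** (8 / 6)          # Kahn's entropy bound for d = 3, n = 8
print(i_q3, round(bound, 1))             # the count falls below the bound
```

The exhaustive count (35) sits just below the entropy bound (about 37), which hints at how sharp such bounds can be.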

One theme common to the two parts of the project is an interest in applying ideas and methods across mathematical boundaries, meaning, on the one hand, bringing the principal investigator's combinatorial perspective to bear on problems from other areas that have mostly been considered by specialists in those areas, and, on the other, using extra-combinatorial ideas to attack combinatorial problems. Many of the problems proposed seem quite hard, but also quite fundamental, as evidenced in particular by the fact that they arise in so many disparate contexts. (This last refers especially to the first part of the project, but not entirely; for instance, though recent applications have been mostly combinatorial, the PI's original contributions to entropy methods were developed to attack problems in probability and statistical mechanics.) It is also true that the difficulties underlying some of these problems appear to show up elsewhere, for instance in the now very active area of "Markov chain Monte Carlo," and in "Mason's Conjecture," a celebrated, 35-year-old problem in matroid theory. So it seems likely that progress on some of the present questions would have further repercussions.

Project Report

The grantee works mostly on combinatorial problems. Some of these arise in other areas (e.g. probability, statistical physics, theoretical computer science). They are usually problems with some history of intractability, and the ideas used to attack them are often drawn from other mathematical disciplines. A few of the successes from this grant period are:

1. The resolution of "Shamir's Problem," which asks (e.g.) for the intensity at which a "perfect packing" appears in a system of random triples. This had been one of the basic---and most intensively studied---problems of probabilistic combinatorics since it was proposed more than thirty years ago. (Shamir's Problem is in fact just one instance of the full result, which also resolves, for example, a whole class of conjectures on factors in random graphs that had been open for more than twenty years.) Ideas from probability and information theory are among the underpinnings of this work. One of the uses to which it has since been put (by other researchers) is the first correct proof of Zuk's Theorem, a major result in the (superficially unrelated) theory of random groups.

2. Asymptotic determination---that is, up to lower-order terms---of the number of 3-SAT functions. (Such functions are a basic construct in the theory of computation.) The far easier case of 2-SAT had been settled earlier, though not easily, but there had been essentially no progress beyond this. One key ingredient here is the "hypergraph regularity lemma," a powerful tool originally developed to attack problems in number theory, again a subject quite different from the one under discussion.

3. Determination of the thresholds (or, as above, "intensities") at which a random graph satisfies (a) Mantel's Theorem and (b) equality of cycle and triangle spaces. Thresholds, which are closely related to the phase transitions of statistical mechanics, are the most basic objects of study in the theory of random graphs.
The results mentioned---each of which again settles a problem that had been open for some time (more than 20 years in the first case)---belong to one of the central combinatorial trends of recent years: an attempt to understand the extent to which some of the subject's fundamental results hold in a random setting. (Mantel's Theorem of 1907 is generally considered the result that initiated the now huge field of extremal graph theory. The problem in (b) is the first case of a general conjecture about the homological behavior of the ``clique complex" of a random graph, a problem from topology rather than combinatorics.)
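For orientation, Mantel's Theorem states that a triangle-free graph on n vertices has at most n^2/4 edges. The following sketch (illustrative only; the project's results concern the random-graph analogue, not this classical statement) verifies the bound for tiny n by exhaustive search:

```python
from itertools import combinations

def max_triangle_free_edges(n):
    """Largest edge count of a triangle-free graph on n vertices,
    found by exhaustive search over all graphs (feasible only for tiny n)."""
    pairs = list(combinations(range(n), 2))
    best = 0
    for mask in range(1 << len(pairs)):
        edges = {pairs[i] for i in range(len(pairs)) if mask >> i & 1}
        # A triangle is three mutually adjacent vertices a < b < c.
        triangle = any((a, b) in edges and (a, c) in edges and (b, c) in edges
                       for a, b, c in combinations(range(n), 3))
        if not triangle:
            best = max(best, len(edges))
    return best

for n in range(2, 6):
    # Mantel's bound floor(n^2/4) is attained by the complete
    # bipartite graph with parts of sizes floor(n/2) and ceil(n/2).
    print(n, max_triangle_free_edges(n), n * n // 4)
```

The extremal graph here is complete bipartite, which is the "deterministic" picture whose random analogue the threshold result in (a) above pins down.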
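To illustrate what a threshold is in this sense, here is a generic simulation (a standard textbook example, not one of the specific thresholds determined in the project): the property "contains a triangle" appears in the random graph G(n, p) around p = 1/n, and sampling at intensities below, at, and above that scale shows the transition:

```python
import random
from itertools import combinations

def has_triangle(n, p, rng):
    """Sample an Erdos-Renyi random graph G(n, p) and
    report whether it contains a triangle."""
    adj = [[False] * n for _ in range(n)]
    for u, v in combinations(range(n), 2):
        if rng.random() < p:
            adj[u][v] = adj[v][u] = True
    return any(adj[a][b] and adj[a][c] and adj[b][c]
               for a, b, c in combinations(range(n), 3))

rng = random.Random(0)
n, trials = 60, 200
for c in (0.2, 1.0, 5.0):
    hits = sum(has_triangle(n, c / n, rng) for _ in range(trials))
    print(f"p = {c}/n: triangle in {hits}/{trials} samples")
```

Well below p = 1/n triangles almost never appear; well above it they almost always do, which is the qualitative shape of every threshold phenomenon mentioned above.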

Agency: National Science Foundation (NSF)
Institute: Division of Mathematical Sciences (DMS)
Application #: 0701175
Program Officer: Tomek Bartoszynski
Budget Start: 2007-06-01
Budget End: 2013-05-31
Fiscal Year: 2007
Total Cost: $459,737
Name: Rutgers University
City: New Brunswick
State: NJ
Country: United States
Zip Code: 08901