Computer science and engineering have made great strides in building high-performing software and hardware systems. Further progress on the hardware front requires multiple processors to work in parallel on the same task. To make adequate use of the parallel hardware, the software needs to be parallelized as well: it needs to specify how to break up the task among the various processors. Software parallelization is often complicated by the fact that a given problem may have several solutions, because the processors do not have much time to coordinate among themselves; without coordination, they may work towards different, incompatible solutions. Isolation, the process of singling out one solution to a computational problem that may have many, is a strategy to ensure that all processors work towards the same solution; it also has a wide range of other algorithmic uses. This project focuses on the power of isolation. Though fundamental in nature and aimed at developing the underlying theory, the project may lead to practical improvements, e.g., for computational problems that involve detecting similarities between certain types of structures. Graduate training and education are core to the project.
The project consists of several thrusts that center on the notion of isolation: (1) Derandomizing known isolation procedures for problems that capture various models of computation. Known procedures are based on the Isolation Lemma, which assigns small random weights to the components of a solution so as to make the solution of minimum total weight unique (a minimal illustration of this weighting scheme appears after the list). The project aims to reduce the number of random bits needed and ultimately remove the need for randomness completely while maintaining efficiency.
(2) Developing deterministic or randomized isolation procedures for well-studied intermediate problems, namely isomorphism problems on graphs and more expressive structures. This thrust relates to a number of known open questions about these problems, including the connection with testing rigidity of structures and with finding a canonical form for the structures (a brute-force canonical-form sketch appears after the list).
(3) Refuting the Unique Games Conjecture, a central conjecture in the area of hardness of approximation with ties to several other mathematical fields. The conjecture states that approximating the optimal yield of strategies for so-called label cover games is as hard for instances that satisfy a certain isolation-like property as it is in general (a standard formal statement appears after the list).
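The weighting scheme behind the Isolation Lemma from thrust (1) can be made concrete in a few lines. The following Python sketch is illustrative only and not one of the project's procedures; the function and variable names are invented for the example. It assigns each element of an n-element universe an independent uniform weight from {1, ..., 2n}; the lemma guarantees that the minimum-weight set in any fixed family is then unique with probability at least 1 - n/(2n) = 1/2.

    import random
    from itertools import combinations

    def isolating_weights(universe, family, seed=None):
        # Assign each element an independent uniform weight from {1, ..., 2n}.
        # By the Isolation Lemma, the minimum-weight member of the family is
        # then unique with probability at least 1 - n/(2n) = 1/2.
        rng = random.Random(seed)
        n = len(universe)
        w = {x: rng.randint(1, 2 * n) for x in universe}
        def weight(s):
            return sum(w[x] for x in s)
        best = min(weight(s) for s in family)
        minimizers = [s for s in family if weight(s) == best]
        return w, minimizers

    # Example: the family of all 2-element subsets of a 6-element universe.
    universe = list(range(6))
    family = [frozenset(c) for c in combinations(universe, 2)]
    weights, mins = isolating_weights(universe, family, seed=0)
    print("weights:", weights)
    print("unique minimum isolated:", len(mins) == 1)

Derandomizing this scheme means replacing the random weight assignment with a small, deterministically constructed collection of weight functions, at least one of which is guaranteed to isolate.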
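For thrust (2), the notion of a canonical form can be illustrated by a brute-force sketch (again not a procedure from the project): a canonical form maps each structure to a fixed representative of its isomorphism class, so two graphs are isomorphic exactly when their canonical forms coincide. The exponential-time version below, for small graphs given as adjacency matrices, picks the lexicographically smallest relabeling.

    from itertools import permutations

    def canonical_form(adj):
        # Lexicographically smallest relabeling of an adjacency matrix.
        # Brute force over all n! vertex permutations; illustration only.
        n = len(adj)
        candidates = (
            tuple(tuple(adj[p[i]][p[j]] for j in range(n)) for i in range(n))
            for p in permutations(range(n))
        )
        return min(candidates)

    # Two differently labeled 3-vertex paths share a canonical form.
    path1 = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]   # edges 0-1, 1-2
    path2 = [[0, 0, 1], [0, 0, 1], [1, 1, 0]]   # edges 0-2, 1-2
    assert canonical_form(path1) == canonical_form(path2)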
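The conjecture in thrust (3) has a standard formal statement, due to Khot, reproduced here for reference rather than taken from this abstract. A unique game is a label cover game in which every constraint is a bijection on the label set, so a label for one endpoint isolates the label for the other. Writing \(\mathrm{val}(G)\) for the maximum fraction of constraints satisfiable by any labeling, the conjecture reads:

    For all \(\varepsilon, \delta > 0\) there exists \(k = k(\varepsilon, \delta)\)
    such that, given a unique game \(G\) with label set \(\{1, \dots, k\}\), it is
    NP-hard to distinguish \(\mathrm{val}(G) \ge 1 - \varepsilon\) from
    \(\mathrm{val}(G) \le \delta\).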
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.