The explosive development of computer technology outpaces theoretical understanding and leaves many of its pillars on shaky ground. As an extreme example, if it turns out that P=NP or that one-way functions do not exist, all of public-key cryptography and many tools crucial for national and personal security would be permanently incapacitated. In this (unlikely but possible) event, much of the computer infrastructure would have to be redesigned on a less impressive scale. On the other hand, many great advances cannot be foreseen even now due to insufficient theoretical understanding of basic issues.
Computations rarely run in deterministic isolation. They interact with users and adversaries, with random events invoked by algorithms or generated by the context, with delays and glitches from the system, hardware, and distributed infrastructure, etc. Some of these interactions are hard to model, but even those with straightforward mathematical models are often very hard to analyze. Randomness and non-determinism are two basic "freedoms" branching out of the concept of deterministic computation that play a crucial role in computing theory. Yet our understanding of their role and power is minimal. Even gradual progress in understanding these phenomena and their relationship to each other and to other concepts would be important.
An example of achievements in this direction is the discovery of a generic relationship between one-way functions and deterministic generation of randomness. Another is the concept of transparent (also called holographic, or PCP) proofs and computations. A number of interesting techniques useful for quite different results in these areas have been accumulated: low-degree polynomials and Fourier transforms over low-periodic groups, related to classical results on error-correcting codes and hashing; expander graphs; hierarchical structures; etc. The PI will continue his investigation of such concepts and of the power of these and other related techniques.
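For concreteness, the simplest instance of that relationship (a textbook special case rather than this project's general construction, and assuming f is a one-way permutation instead of an arbitrary one-way function) is the one-bit-stretch pseudorandom generator built from the Goldreich-Levin hardcore bit:
$$ G(x,r) \;=\; \big(f(x),\; r,\; \langle x,r\rangle \bmod 2\big), \qquad x,r \in \{0,1\}^n . $$
Any feasible test distinguishing the output of $G$ from $2n+1$ uniform bits would yield a predictor of $\langle x,r\rangle$ from $(f(x),r)$, and hence an inverter for $f$.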
Symmetry is one of the central phenomena in many fields. In computation it can simplify analysis and provide uniformity and redundancy useful, e.g., for error-correction. On the other hand, it can cause indecisiveness and deadlocks, and can complicate the initialization and organization of computing processes. Breaking symmetries is as essential a task as maintaining them. The PI intends to study a number of mathematical and algorithmic tools useful for symmetry breaking. Examples include Thue sequences, aperiodic tilings, extensions of the concept of flat connections from manifolds to graphs, and others.
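As a small self-contained illustration of the first of these tools (a standard construction, not drawn from the proposal), the Thue-Morse word, and the square-free ternary word derived from it, avoid repetition along a line, the simplest form of symmetry breaking. A minimal Python sketch:

    # Illustrative sketch: the Thue-Morse word t, with t[i] = parity of the
    # number of 1-bits of i. It is overlap-free, and the ternary word of
    # 1-run lengths between consecutive 0s is square-free (Thue, 1906).

    def thue_morse(n):
        """First n letters of the Thue-Morse word over {0,1}."""
        return [bin(i).count("1") % 2 for i in range(n)]

    def has_square(w):
        """Brute-force check for a factor of the form xx (a 'square')."""
        for i in range(len(w)):
            for L in range(1, (len(w) - i) // 2 + 1):
                if w[i:i+L] == w[i+L:i+2*L]:
                    return True
        return False

    t = thue_morse(64)
    # Ternary word: number of 1s between consecutive 0s of t (values in {0,1,2}).
    zeros = [i for i, b in enumerate(t) if b == 0]
    ternary = [sum(t[a+1:b]) for a, b in zip(zeros, zeros[1:])]
    print(ternary)               # starts 2, 1, 0, 2, 0, 1, 2, ...
    print(has_square(ternary))   # False: no block ever occurs twice in a row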
The work will advance discovery and understanding via dissemination of the results through talks, papers, and Web pages, the PI's own and those of others. This research has implications in many close and remote areas. Indeed, the concepts the PI is interested in, such as randomness, are fundamental in a broad variety of fields of knowledge.
Our computing environment has outgrown the settings for which the classical concept of computability (Church's Thesis: the only computable tasks are those with a recursive solution) was developed. Our computations are now neither deterministic nor single-user; their goals do not always lie in producing a unique desirable output; and so on. A trivial example of the inapplicability of the classical concept is finding long strings that cannot be compressed to half their length: a non-recursive task, yet easily solvable by tools as simple as coin flips. A famous example of thought along these lines is Goedel's belief that, despite his incompleteness theorem, informal arguments can eventually provide consistent answers to any mathematical question.
To address these issues, recursivity must be replaced with a different solvability concept immune to such limitations of applicability. The version I consider is the impossibility (by any means: formal, informal, primitive, sophisticated, interactive, whatever) of obtaining ANY significant information about SPECIFIC mathematically defined sequences. (Among its applications, tricky to prove, is a refutation of the above belief of Goedel.) To state this principle we must refine Kolmogorov's concept of Algorithmic Information. Unlike the related concepts of Algorithmic Complexity and Randomness, it still presents basic difficulties. I advanced this theory further, providing better definitions with neater properties.
We also studied applications of these areas to other problems, such as tilings. A side effect of these studies of tiling is a more transparent understanding of aperiodic tilings that may help in other areas, such as symmetry breaking for the purpose of self-stabilization of computing media.
Another project we worked on concerns self-stabilizing, self-organizing automata on arbitrary asynchronous networks. Such automata recover from any faulty initialization and then efficiently simulate any protocol designed for error-free synchronous networks. To boost efficiency, we reduce the problem to several simpler ones. The reduction is very tricky but remarkably minimalistic: each automaton can only store one bit and mark two of its edges.
I worked on a number of other problems, but the above are representative examples that I have tried to express in a short non-technical statement, as is now required. Useful results are usually quite involved and not easy to describe in non-technical terms.
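One exception small enough to spell out is the coin-flip example above (a standard counting argument, not specific to this project): programs shorter than $n/2$ bits are too few to describe all $n$-bit strings,
$$ \#\{x\in\{0,1\}^n : K(x) < n/2\} \;\le\; \#\{\text{binary programs of length} < n/2\} \;<\; 2^{n/2}, $$
so a string of $n$ fair coin flips is incompressible below $n/2$ bits with probability greater than $1 - 2^{-n/2}$, whereas no single algorithm can output, for every length $n$, a string of complexity at least $n/2$.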