_Information Mechanics_ extends in a natural way the motivations and methods of information theory, which originally dealt primarily with a _static_ bookkeeping of information, to the study of _dynamical_ processes. As a system evolves in time, the parameters that account for the different forms of information encoded in it will evolve too, and one is naturally led to the study of the laws that govern such evolution. An important area of this research is the study of _reversible_ processes. Reversible processes are, by definition, "information-conserving." The generic property of "reversibility" expresses itself in a variety of specialized ways as further structure is introduced into the dynamical systems under study. Recent developments and applications show that this is a productive line of investigation, with a direct bearing on the theory and practice of computation and on the mathematical modeling of physics.

Motivations and applications for this research are found, among others, in the following problems:

- Models of computation compatible with microscopic physics; dissipation in computation.
- Invertible cellular automata, and cellular-automaton models of processes traditionally modeled by differential equations.
- Lattice-gas models of fluid dynamics.
- Collective phenomena and phase transitions.
- Parallel computation; synchronization; special-relativistic effects in distributed computation.
- Quantum computation.
- Information-theoretical generalization of physical concepts such as entropy, energy, temperature, and action.
- Generalization to computational processes of conservation laws and variational principles.

The "domestication" of natural physical processes for human goals requires an appropriate understanding of these processes; thus, the domestication of energy conversion in large amounts (typified by the development of steam engines during the industrial revolution) entailed opening up a new chapter of physics, namely thermodynamics. Today we are just beginning to tap the _information_-processing (rather than _energy_-processing) resources of physics, say, by turning copper and silicon into computers. However, many modeling tasks of great conceptual or practical interest (e.g., in particle physics, meteorology, molecular biology, pattern recognition, and optimization problems) demand amounts of computation vastly in excess of what can be achieved with the present level of exploitation of nature's intrinsic computational resources. As a new chapter of physics, _Information Mechanics_ addresses the question of how much more computing power is hidden within the finer machinery of nature.
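To make the notion of an "information-conserving" dynamics concrete, the following is a minimal sketch, not part of the award abstract, of an invertible cellular automaton built with a second-order, XOR-based construction; the cell count, the particular local rule, and all function names are illustrative assumptions. The XOR with the state two steps in the past is what guarantees invertibility, independently of the local rule chosen: running the same update forward and then backward recovers the initial configuration bit for bit.

```python
# Illustrative sketch only: a second-order reversible cellular automaton.
# XOR-ing the local rule's output with the state two steps back makes the
# overall dynamics invertible, hence "information-conserving".
import random

N = 64  # ring of N cells with periodic boundary (size chosen arbitrarily)

def local_rule(left, center, right):
    # Any Boolean function of the neighborhood works; parity is an
    # arbitrary choice for this example.
    return (left + center + right) % 2

def step(past, present):
    # One second-order update: future[i] = rule(neighborhood of i) XOR past[i].
    future = [
        local_rule(present[(i - 1) % N], present[i], present[(i + 1) % N]) ^ past[i]
        for i in range(N)
    ]
    return present, future  # new (past, present) pair

if __name__ == "__main__":
    a = [random.randint(0, 1) for _ in range(N)]
    b = [random.randint(0, 1) for _ in range(N)]

    # Evolve forward 100 steps from the pair (a, b).
    past, present = a, b
    for _ in range(100):
        past, present = step(past, present)

    # Swap the time order and apply the same update to run backward.
    past, present = present, past
    for _ in range(100):
        past, present = step(past, present)

    # The initial configuration is recovered exactly: no information was lost.
    assert (present, past) == (a, b)
    print("initial state recovered after forward/backward evolution")
```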

Agency: National Science Foundation (NSF)
Institute: Division of Information and Intelligent Systems (IIS)
Application #: 8618002
Program Officer: Jolita D. Middleton
Project Start:
Project End:
Budget Start: 1987-07-01
Budget End: 1990-12-31
Support Year:
Fiscal Year: 1986
Total Cost: $236,160
Indirect Cost:
Name: Massachusetts Institute of Technology
Department:
Type:
DUNS #:
City: Cambridge
State: MA
Country: United States
Zip Code: 02139