This research develops efficient algorithms for: (i) normalization, i.e., simplification of expressions using a set of rules (called a rewrite system), and (ii) deciding fundamental properties of rewrite systems. The work develops analysis-based algorithms for normalization with static rules, incremental algorithms for normalization with dynamic rules, and extensions of efficient tabling algorithms. The practical performance of these algorithms is evaluated in the Laboratory for Rapid Rewriting test bed developed at the University of Houston. For several fundamental properties, including uniqueness of normal forms, confluence, and the word problem, efficient algorithms and lower bounds are studied for decidable subclasses, and tight relationships among these problems are investigated using the concept of resource-bounded reductions.
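To make the notion of normalization concrete, the following is a minimal Python sketch of normalization by exhaustive rewriting; the term encoding, the Peano-addition rule set, and all function names (match, substitute, rewrite_once, normalize) are illustrative assumptions for exposition, not artifacts of this project.

    # Terms are nested tuples: ("add", ("s", x), y). Lone strings in a
    # pattern are variables; constants like zero are one-element tuples.

    def match(pattern, term, subst):
        """Try to extend subst so pattern matches term; None on failure."""
        if isinstance(pattern, str):              # pattern variable
            bound = subst.get(pattern)
            if bound is None:
                return {**subst, pattern: term}
            return subst if bound == term else None
        if (not isinstance(term, tuple) or len(pattern) != len(term)
                or pattern[0] != term[0]):
            return None
        for p, t in zip(pattern[1:], term[1:]):
            subst = match(p, t, subst)
            if subst is None:
                return None
        return subst

    def substitute(template, subst):
        """Instantiate a rule's right-hand side under a substitution."""
        if isinstance(template, str):
            return subst[template]
        return (template[0],) + tuple(substitute(a, subst) for a in template[1:])

    def rewrite_once(rules, term):
        """Apply one rule at the outermost redex, else recurse into subterms."""
        for lhs, rhs in rules:
            s = match(lhs, term, {})
            if s is not None:
                return substitute(rhs, s), True
        if isinstance(term, tuple):
            for i, arg in enumerate(term[1:], start=1):
                new, changed = rewrite_once(rules, arg)
                if changed:
                    return term[:i] + (new,) + term[i + 1:], True
        return term, False

    def normalize(rules, term):
        """Rewrite until no rule applies (assumes the system terminates)."""
        changed = True
        while changed:
            term, changed = rewrite_once(rules, term)
        return term

    # Peano addition: add(0, y) -> y;  add(s(x), y) -> s(add(x, y))
    rules = [
        (("add", ("zero",), "y"), "y"),
        (("add", ("s", "x"), "y"), ("s", ("add", "x", "y"))),
    ]

    two = ("s", ("s", ("zero",)))
    one = ("s", ("zero",))
    print(normalize(rules, ("add", two, one)))
    # -> ('s', ('s', ('s', ('zero',)))), i.e., 2 + 1 normalizes to 3

In this sketch the rules are fixed in advance (the static case); the dynamic and incremental algorithms studied in this project concern rule sets that change during computation, which this naive fixpoint loop does not address.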
Normalization is a fundamental operation found in virtually all symbolic computation and computer algebra systems. Applications include functional and equational logic programming, data type specification, formal verification, automated deduction, code generation, and type inference. This research is expected to yield: (i) enhanced efficiency and power of symbolic computing and computer algebra systems, (ii) better understanding of the inherent complexities of the operations involved, and (iii) new results, techniques, and insights into fundamental properties of rewrite systems. Broader impacts include: training of undergraduate and graduate students, including underrepresented minorities; new course materials based on this research; and dissemination of the developed software for extension and use.