A key problem in programming languages is the design and implementation of an appropriate balance between the static (compile-time) and dynamic (run-time) phases of computation. A good balance permits the generation of efficient code without imposing an undue burden on programmers or compilers. Finding this balance relies critically on intuition and experimentation, but these methods are not sufficient by themselves. As type systems and optimizing transformations have become more sophisticated, it has become essential to provide rigorous means of expressing static/dynamic relationships at suitable levels of abstraction. Programmers and compiler writers rely on these relationships, so a precise yet approachable description of them is important; relying on intuition and experimentation alone may allow errors, or valuable opportunities for generalization, to go unnoticed. The goal of this project is to develop and apply techniques for rigorously expressing and proving relationships between the static and dynamic semantics of programming languages. Primary objectives include (i) finding ways to specify meanings at levels of abstraction suitable for reasoning about sharing, copying, and liveness; (ii) developing ways to represent and manipulate semantic specifications formally; and (iii) aiding the development of new interface definition languages by formulating techniques for describing the properties that module compatibility guarantees for the systems that integrate those modules.