We will investigate techniques for easy, interactive construction of high-quality multisensory computer models of real, physical objects. By multisensory models we mean models that support multisensory interaction: they simulate not only the appearance of an object but also its physical response, including the associated forces and sounds. Because human interaction with the physical world is inherently multisensory, people will be able to effortlessly combine information from vision, hearing, and touch to manipulate such computer models.
Specifically, we propose to develop an integrated environment well suited to interactive modeling by humans. We will explore new techniques to make the modeling task as easy as possible: providing rapid feedback about the state of the model, developing novel sensing and display systems, and developing software tools to plan and generalize measurements. We will use these techniques to interactively create multisensory models of contact, of the kind sketched below.
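To make the notion of a multisensory contact model concrete, the sketch below shows one simple form such a model could take: a penalty-based (spring-damper) contact force coupled to a single damped modal resonator, so that a simulated tap yields both a force profile (for haptic display) and an audio-rate ringing signal (for sound). This is an illustrative assumption, not the proposed system; all function names and parameter values (stiffness, damping, modal frequency, masses) are invented for the example.

```python
# Illustrative sketch of a minimal multisensory contact model:
# a penalty contact force drives both the object's motion (haptics)
# and a damped modal oscillator (sound). All parameters are assumed.

import math

def contact_force(penetration, penetration_rate, k=2000.0, b=5.0):
    """Normal contact force [N] from a spring-damper penalty model."""
    if penetration <= 0.0:
        return 0.0                      # no force when out of contact
    return max(0.0, k * penetration + b * penetration_rate)

class ModalResonator:
    """One vibration mode, x'' + 2*zeta*w*x' + w^2*x = f,
    integrated with semi-implicit Euler at audio rate."""
    def __init__(self, freq_hz=440.0, damping=0.002, sample_rate=44100):
        self.w = 2.0 * math.pi * freq_hz
        self.zeta = damping
        self.dt = 1.0 / sample_rate
        self.x = 0.0                    # modal displacement (audio signal)
        self.v = 0.0                    # modal velocity

    def step(self, force):
        a = force - 2.0 * self.zeta * self.w * self.v - self.w ** 2 * self.x
        self.v += a * self.dt
        self.x += self.v * self.dt
        return self.x

if __name__ == "__main__":
    # Simulate a 50 ms tap: a small mass falls onto the surface and rebounds.
    sample_rate = 44100
    dt = 1.0 / sample_rate
    mass = 0.05                         # kg (assumed)
    pos = 0.001                         # m above the surface
    vel = -0.5                          # m/s, moving toward the surface

    resonator = ModalResonator()
    forces, audio = [], []

    for _ in range(int(0.05 * sample_rate)):
        penetration = -pos              # positive once below the surface
        f = contact_force(penetration, -vel)
        vel += (f / mass - 9.81) * dt   # contact force plus gravity
        pos += vel * dt
        forces.append(f)                # force profile for haptic rendering
        audio.append(resonator.step(f)) # same force excites the sound mode

    print(f"peak contact force: {max(forces):.2f} N")
    print(f"peak audio amplitude: {max(abs(a) for a in audio):.2e}")
```

In an interactive modeling environment of the kind proposed, the stiffness, damping, and modal parameters in such a model would not be hand-picked constants but would be estimated from measurements of real objects, with rapid feedback letting the user judge whether the resulting forces and sounds match the physical object.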