This research will investigate the fundamental mechanisms by which human observers perceive and remember the 3D structure of smoothly curved surfaces. The project will address two basic questions: first, how are objects in 3D space represented mentally; and second, what properties of optical structure determine this knowledge perceptually. It is clear from everyday perceptual experience that visual images on the retina provide sufficient information for us to perceive the 3D structure of the environment adequately; yet it is equally clear, on closer reflection, that the properties of visual images have little in common with the properties of real objects. Real objects exist in 3D space and are composed of tangible substances such as earth, metal, or flesh, whereas an image of an object is confined to a 2D projection surface and consists of nothing more than flickering patterns of light. Although the problem of how human observers cope with this seemingly incommensurate mapping between objects and images is an ancient one, research in this area has gained new impetus in recent years from attempts to develop artificial visual systems for robots and prosthetic devices for the blind. This project will attempt to facilitate those efforts by providing a more detailed understanding of how similar problems are solved by the human visual system.

In order to model the processes of 3D form perception rigorously, it is first necessary to define precisely what those processes accomplish for us. Most previous investigations in this area have assumed that each visible surface point is encoded perceptually in terms of its metric depth relative to the point of observation. This research, in contrast, is designed to explore a much wider variety of potential representations. It builds on earlier findings that perceptual judgments of metric depth are surprisingly inaccurate, and it stems from the hypothesis, supported by those results, that visual knowledge of 3D structure may often involve a more abstract form of representation in which an observed surface is perceptually encoded in terms of its nonmetric properties (e.g., ordinal or nominal relations among surface points).

A variety of experiments, using both natural and computer-generated images, will test this hypothesis by asking observers to make judgments that draw on different levels of knowledge about 3D structure. Other experiments will identify some of the specific computational mechanisms through which perceptual representations of 3D form are generated from optical information, and will measure the stability of these mechanisms over a wide range of viewing conditions.
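The contrast between a metric depth representation and a nonmetric (ordinal) one can be made concrete with a small illustration. The sketch below is not part of the proposal; the depth values and the helper function are hypothetical. It shows how an ordinal encoding retains only the depth order of surface points, so that two surfaces differing by a stretch in depth become indistinguishable to an observer who encodes only ordinal relations.

```python
import numpy as np

# Toy 1-D "depth map": metric depth (arbitrary units) of five surface
# points along a scan line across a smoothly curved surface.
metric_depth = np.array([2.3, 2.1, 1.8, 2.0, 2.6])

# A metric representation stores these values directly (up to a scale factor).
# An ordinal (nonmetric) representation keeps only the depth ORDER of point
# pairs: which of two points is nearer to the observer, not by how much.
def ordinal_relations(depth):
    """Map each point pair (i, j), i < j, to -1, 0, or +1, indicating
    whether point i is nearer than, equal to, or farther than point j."""
    n = len(depth)
    return {(i, j): int(np.sign(depth[i] - depth[j]))
            for i in range(n) for j in range(i + 1, n)}

# Two surfaces that differ only by a scaling in depth have identical
# ordinal structure, so an observer relying on ordinal relations could not
# tell them apart -- consistent with the finding that metric depth
# judgments are surprisingly inaccurate.
stretched = 3.0 * metric_depth
assert ordinal_relations(metric_depth) == ordinal_relations(stretched)
print(ordinal_relations(metric_depth))
```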

Agency: National Science Foundation (NSF)
Institute: Division of Behavioral and Cognitive Sciences (BCS)
Application #: 8908426
Program Officer: Jasmine V. Young
Budget Start: 1989-09-01
Budget End: 1994-02-28
Fiscal Year: 1989
Total Cost: $205,919
Name: Brandeis University
City: Waltham
State: MA
Country: United States
Zip Code: 02454