Although the visual world appears continuous and stable, visual information is actually sampled from the environment in brief glimpses, roughly three times per second. Observers must therefore process the contents of these brief glimpses to form representations that can support effective behavior. Brief stimulus presentations have been crucial for illuminating the early stages of constructing scene representations, yet very little is known about the time required to extract information about observer-to-object distances. There is a pressing need to understand these issues, because many factors, including everyday situational demands, visual impairment, normal aging, and neurological disorders, can constrain the time available for extracting and processing visual information. Many people therefore risk the consequences of poor object localization due to insufficient viewing time, such as falling or colliding with objects while walking or driving. These consequences can be dire: the annual cost of falling, for example, is predicted to reach $54.9 billion within the next 10 years. There is a critical lack of knowledge about how insufficient viewing time affects localization in distance, and this gap impedes identification of at-risk populations and slows the development of evidence-based remediation plans. This project will remove these critical barriers by quantifying the impact of insufficient viewing time on localization; its health relatedness thus derives from its ability to illuminate possible precursors to driving collisions and falls.

Our long-term objectives are to characterize the time course of distance perception and to determine the psychological and neural mechanisms that govern it. We will address these issues using a novel, custom-built apparatus capable of providing very brief glimpses (e.g., 10 ms) of a real, 3D environment, followed by a masking image. After briefly glimpsing the environment, observers will use various response methods (e.g., verbal report; blind walking) to indicate the egocentric distance of objects seen during the glimpse. This approach allows us to study the factors that shape the early stages of distance perception. The experiments in this proposal will test our overarching hypothesis that both the stimulus-driven and top-down factors governing early distance perception mechanisms are organized to confer a processing advantage for targets on the ground.
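To make the trial structure concrete, the sketch below illustrates the general glimpse-plus-mask timing logic described above. It is a hypothetical, minimal sketch only, not the actual control software for the apparatus: the lamp and mask trigger functions and the mask duration are placeholder assumptions, and millisecond-accurate presentation in practice requires hardware-level timing rather than software sleeps.

```python
# Hypothetical sketch of a single glimpse-plus-mask trial.
# The lamp/mask functions are placeholders standing in for whatever hardware
# triggers the real apparatus uses; real tachistoscopic timing would need
# hardware-level control, not time.sleep().
import time

GLIMPSE_MS = 10   # brief glimpse duration (e.g., 10 ms, as in the proposal)
MASK_MS = 500     # mask duration (assumed value for illustration)

def scene_lamp_on():  print("scene illuminated")   # placeholder trigger
def scene_lamp_off(): print("scene dark")          # placeholder trigger
def mask_on():        print("mask displayed")      # placeholder trigger
def mask_off():       print("mask removed")        # placeholder trigger

def run_trial(glimpse_ms=GLIMPSE_MS, mask_ms=MASK_MS):
    """Present a brief glimpse of the real scene, then a mask,
    then collect the observer's distance response."""
    scene_lamp_on()
    time.sleep(glimpse_ms / 1000.0)   # observer glimpses the 3D environment
    scene_lamp_off()
    mask_on()
    time.sleep(mask_ms / 1000.0)      # mask limits further visual processing
    mask_off()
    # Response phase: e.g., verbal report (blind walking would be recorded
    # separately by the experimenter).
    return input("Indicate target distance (e.g., in feet): ")

if __name__ == "__main__":
    run_trial()
```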
Our specific aims are to (1) determine the visual requirements for extracting distance information from brief glimpses, focusing particularly on the powerful angular declination (height in the visual field) cue; (2) determine the top-down influences on the extraction of distance information from brief glimpses, focusing on perceptual and cognitive biases related to the ground plane; and (3) confirm that our results are not crucially dependent on one particular environment, but instead are more fundamental and broadly applicable to a variety of environments.
This project investigates people's ability to localize objects seen during brief glimpses of the surrounding environment, a critically important skill given the potentially devastating consequences of mislocalization when there is insufficient time to extract distance cues (e.g., falling, or colliding with objects while walking or driving). Limitations in the time available to extract distance information can arise from many factors, including visual impairment, neurological disorders, normal aging, and situational constraints in everyday life. By determining how perceptual and cognitive factors govern the speed with which people localize objects in the environment, this work promises to improve efforts to minimize the tremendous personal and health care costs of falls and driving accidents across a broad range of populations.
Gajewski, Daniel A; Wallin, Courtney P; Philbeck, John W (2015) The effects of age and set size on the fast extraction of egocentric distance. Vis Cogn 23:957-988
Malcolm, George L; Shomstein, Sarah (2015) Object-based attention in real-world scenes. J Exp Psychol Gen 144:257-63
Kramer, Benjamin A; Philbeck, John W; Dopkins, Stephen et al. (2015) Getting completely turned around: how disorientation impacts subjective straight ahead. Mem Cognit 43:143-50
Philbeck, John W; Witt, Jessica K (2015) Action-specific influences on perception and postperceptual processes: present controversies and future directions. Psychol Bull 141:1120-44
Chichka, David; Philbeck, John W; Gajewski, Daniel A (2015) Tachistoscopic illumination and masking of real scenes. Behav Res Methods 47:45-52
Gajewski, Daniel A; Philbeck, John W; Wirtz, Philip W et al. (2014) Angular declination and the dynamic perception of egocentric distance. J Exp Psychol Hum Percept Perform 40:361-77
Gajewski, Daniel A; Wallin, Courtney P; Philbeck, John W (2014) Gaze behavior and the perception of egocentric distance. J Vis 14
Gajewski, Daniel A; Wallin, Courtney P; Philbeck, John W (2014) Gaze direction and the extraction of egocentric distance. Atten Percept Psychophys 76:1739-51
Sargent, Jesse Q; Zacks, Jeffrey M; Philbeck, John W et al. (2013) Distraction shrinks space. Mem Cognit 41:769-80
Gajewski, Daniel A; Philbeck, John W; Pothier, Stephen et al. (2010) From the most fleeting of glimpses: on the time course for the extraction of distance information. Psychol Sci 21:1446-53