Classic psychophysical vision research has focused on attentive vision: an observer is directed to attend and respond to a specific stimulus in a specific location. Most of vision, however, is preattentive. Preattentive mechanisms process a few features in parallel across the entire visual field. Thus, a preattentive color processor could find a red item independent of any green items. Data from the PI's lab and others show that preattentive mechanisms parse the visual input into preattentive objects. Attention selects one or, perhaps, a few of those objects for further analysis by attentive processes. The PI has also shown that the perceptual consequences of attention do not persist when attention is deployed to a new object. If the effects of attention are limited to a handful of objects, the rest of vision must be preattentive. Thus, any thorough understanding of vision must include an understanding of the psychophysics of preattentive vision and of the interactions between preattentive and attentive visual processes. While progress has been made, the psychophysics of preattentive vision remains crude compared to that of attentive vision. For example, while it is known that preattentive motion processing can discriminate upward from downward motion, little is known about the shape of the channels that process motion before the arrival of attention. The PI will improve this situation with a program of research having four specific aims:
1) He will use new and promising visual search methods to reveal the structure of the channels that support preattentive processing of visual features, including orientation and motion.
2) He will determine whether selection by attention is required for all visual search tasks, or whether the outputs of preattentive processes are by themselves adequate to perform the simplest search tasks (e.g., a search for an item of unique color).
3) Given the evidence that attention selects objects and not merely spatial locations, there must be preattentive objects for attention to select. He will investigate the properties of those preattentive objects: are an object's top and bottom and its internal structure represented? Further, he will study an important neuropsychological patient whose bilateral parietal lobe lesions appear to reduce his perceptual world to the current object of attention.
4) Finally, the PI's Guided Search model has succeeded in describing the interactions of preattentive and attentive visual processes within the context of visual search tasks. He will expand the Guided Search model to handle additional data (e.g., eye movements in visual search and the "attentional blink").
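To make the flavor of the Guided Search account concrete, the following Python sketch simulates its core mechanism: preattentive processing assigns each display item a noisy activation, a target (if present) receives an extra top-down guidance signal, and attention visits items in descending order of activation until the target is found. This is an illustrative toy under assumed parameters (guidance, noise_sd, time_per_item, base_time are hypothetical values, not fitted ones), not the PI's actual implementation of the model.

```python
import random

def guided_search_trial(set_size, target_present,
                        guidance=2.0, noise_sd=1.0,
                        time_per_item=50, base_time=400):
    """Simulate one trial of a Guided Search-style model (toy version).

    Distractor activations are drawn from a noise distribution; the
    target, if present, gets an additional top-down guidance boost.
    Attention then samples items from highest to lowest activation.
    Returns (reaction time in ms, response).
    """
    activations = [random.gauss(0.0, noise_sd) for _ in range(set_size)]
    target_index = None
    if target_present:
        target_index = random.randrange(set_size)
        activations[target_index] += guidance  # top-down feature guidance

    # Deploy attention to items in descending order of activation.
    visit_order = sorted(range(set_size),
                         key=lambda i: activations[i], reverse=True)
    for n_visited, i in enumerate(visit_order, start=1):
        if i == target_index:
            return base_time + time_per_item * n_visited, "present"

    # Target-absent trials end after an exhaustive scan (a simplification;
    # real observers use a quitting rule rather than exhaustive search).
    return base_time + time_per_item * set_size, "absent"

if __name__ == "__main__":
    for set_size in (4, 8, 16):
        rts = [guided_search_trial(set_size, target_present=True)[0]
               for _ in range(2000)]
        print(f"set size {set_size:2d}: mean RT {sum(rts) / len(rts):.0f} ms")
```

With strong guidance, the simulated target-present RT x set size function is nearly flat (efficient "pop-out" search); setting guidance=0 yields steep, inefficient search. That qualitative contrast is the kind of preattentive-attentive interaction the model is meant to capture.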
Palmer, Evan M; Horowitz, Todd S; Torralba, Antonio et al. (2011) What are the shapes of response time distributions in visual search? J Exp Psychol Hum Percept Perform 37:58-71
Wolfe, Jeremy M; Palmer, Evan M; Horowitz, Todd S (2010) Reaction time distributions constrain models of visual search. Vision Res 50:1304-11
Wolfe, Jeremy M; Reijnen, Ester; Van Wert, Michael J et al. (2009) In visual search, guidance by surface type is different than classic guidance. Vision Res 49:765-73
Wolfe, Jeremy M; Horowitz, Todd S; Van Wert, Michael J et al. (2007) Low target prevalence is a stubborn source of errors in visual search tasks. J Exp Psychol Gen 136:623-38
Wolfe, Jeremy M; Horowitz, Todd S; Michod, Kristin O (2007) Is visual attention required for robust picture memory? Vision Res 47:955-64
Wolfe, J M; Klempen, N; Dahlen, K (2000) Postattentive vision. J Exp Psychol Hum Percept Perform 26:693-716
Wolfe, J M; Bennett, S C (1997) Preattentive object files: shapeless bundles of basic features. Vision Res 37:25-43
Wolfe, J M (1995) The pertinence of research on visual search to radiologic practice. Acad Radiol 2:74-8
Bilsky, A B; Wolfe, J M (1995) Part-whole information is useful in visual search for size x size but not orientation x orientation conjunctions. Percept Psychophys 57:749-60
Wolfe, J M; Friedman-Hill, S R; Bilsky, A B (1994) Parallel processing of part-whole information in visual search tasks. Percept Psychophys 55:537-50