Virtually every visual and visuomotor function hinges first and foremost on the visual system's ability to consistently and accurately localize stimuli, especially in cluttered and dynamic scenes. However, it is unknown what sorts of information are integrated to determine perceived position, how such integration occurs, or what the neural mechanism(s) and loci of this process are. Perceptual localization in typically cluttered and dynamic scenes requires the visual system to both assign and update object positions; to do this, it relies not just on the retinal location of the object of interest, but also, principally, on four additional factors: visual motion, frames of reference, eccentricity biases, and contrast adaptation. To understand how the visual system assigns and updates object positions, we must approach localization not as an isolated process but as an integrative one, one that depends on contextual information in the scene. Our proposed experiments have two goals. The first is to psychophysically measure perceived position as a function of retinal position, contrast adaptation, visual motion, eccentricity bias, and frames of reference in order to generate a novel multifactorial integration field model; we will then use this model in fMRI experiments to test whether neurotopic organization across visual cortex is heterogeneous (unique position codes in each visual area) or homogeneous (identical across visual areas). Our pilot results suggest that there are unique position codes in different visual areas. The second goal of the proposal is to test whether these unique position codes (the differences in topographic organization) have perceptual consequences. We will use psychophysics to test the predicted double dissociations in perceived location and then use TMS to test the causal contribution of heterogeneous visual cortical topographic organization to perceived position.
One novelty of our approach lies in developing a new mixed generative and discriminative model of spatial coding that can be applied to psychophysical and fMRI data in tandem and that allows us to make predictions from fMRI results about perceptual outcomes in specific situations. The causal relationship between fMRI results and perceptual outcomes will then be tested with TMS. Our experiments will provide novel insight into how cues are integrated to determine perceived position at each stage of visual processing, which is crucial to understanding the fundamental localization deficits that occur in a range of visual and cognitive impairments, from amblyopia and macular degeneration to autism. Until we understand how position is assigned in the typical brain, we lack the insight necessary to develop diagnostic tools, predictive markers, and treatment outcome measures for these impairments.

Public Health Relevance

Perceptual localization is arguably the most important function of the visual system, yet many neurological disorders produce specific deficits in visual and visuomotor localization, including amblyopia, optic ataxia, dyslexia, akinetopsia, autism, fragile X syndrome, and schizophrenia, among others. Until we understand how perceived position is determined in the normal brain for objects seen in typical cluttered, dynamic scenes, we lack the insight necessary to develop diagnostic tools, predictive markers, and therapies for the spatial component of these impairments.

Agency
National Institutes of Health (NIH)
Institute
National Eye Institute (NEI)
Type
Research Project (R01)
Project #
5R01EY018216-08
Application #
8819544
Study Section
Mechanisms of Sensory, Perceptual, and Cognitive Processes Study Section (SPC)
Program Officer
Flanders, Martha C
Project Start
2007-04-01
Project End
2017-03-31
Budget Start
2015-04-01
Budget End
2017-03-31
Support Year
8
Fiscal Year
2015
Total Cost
Indirect Cost
Name
University of California Berkeley
Department
Psychology
Type
Schools of Arts and Sciences
DUNS #
124726725
City
Berkeley
State
CA
Country
United States
Zip Code
94704
Chen, Zhimin; Denison, Rachel N; Whitney, David et al. (2018) Illusory occlusion affects stereoscopic depth perception. Sci Rep 8:5297
Piazza, Elise A; Theunissen, Frédéric E; Wessel, David et al. (2018) Rapid Adaptation to the Timbre of Natural Sounds. Sci Rep 8:13826
Chen, Zhimin; Kosovicheva, Anna; Wolfe, Benjamin et al. (2018) Unifying Visual Space Across the Left and Right Hemifields. Psychol Sci 29:356-369
Manassi, Mauro; Liberman, Alina; Chaney, Wesley et al. (2017) The perceived stability of scenes: serial dependence in ensemble representations. Sci Rep 7:1971
Maus, Gerrit W; Duyck, Marianne; Lisi, Matteo et al. (2017) Target Displacements during Eye Blinks Trigger Automatic Recalibration of Gaze Direction. Curr Biol 27:445-450
Sweeny, Timothy D; Whitney, David (2017) The center of attention: Metamers, sensitivity, and bias in the emergent perception of gaze. Vision Res 131:67-74
Chen, Zhimin; Maus, Gerrit W; Whitney, David et al. (2017) Filling-in rivalry: Perceptual alternations in the absence of retinal image conflict. J Vis 17:8
Kosovicheva, Anna; Whitney, David (2017) Stable individual signatures in object localization. Curr Biol 27:R700-R701
Kiyonaga, Anastasia; Scimeca, Jason M; Bliss, Daniel P et al. (2017) Serial Dependence across Perception, Attention, and Memory. Trends Cogn Sci 21:493-497
Leib, Allison Yamanashi; Kosovicheva, Anna; Whitney, David (2016) Fast ensemble representations for abstract visual impressions. Nat Commun 7:13186

Showing the most recent 10 out of 69 publications