In addition to understanding the mechanisms by which signals about the visual world are produced in cortical neurons, it is important to establish which of these signals are used by the subject for perception. Classical techniques (lesions, microstimulation) have allowed such inferences to be made about areas of cortex or small groups of neurons, but these techniques do not have single-neuron resolution. A powerful technique that has been successfully applied to single neurons is to measure correlations between neuronal activity and reported sensation. In my laboratory, animals are trained to report perceived depth in a threshold disparity task. Even on the fraction of trials that contain no signal, the activity of individual neurons is correlated with the animal's reports: on trials where the animal reports a near sensation, neurons with near preferred disparities fire more spikes than on trials where the animal reports a far sensation, despite the fact that the visual stimuli are the same. This correlation is typically quantified with a non-parametric measure called Choice Probability (CP), and such measures have been widely used in attempts to determine how different sensory neurons contribute to perception.

A difficulty with this use of CP measurements is that, because they only measure a correlation, they do not demonstrate causation. One possibility, which is the widely adopted interpretation, is that this correlation arises because fluctuations in neuronal activity cause different perceptual states (a bottom-up account). Quantitative models based on this idea have been very successful in describing the data. But this success does not exclude a second class of model in which something associated with the animal's perceptual state causes changes in the firing rate of sensory neurons (a top-down account). Although this problem has long been recognized, no experimental approaches had been able to differentiate these two descriptions. Previous work in this project provided powerful, but indirect, evidence that at least part of this correlation reflects top-down signals associated with the animal's choice.

We have now exploited structure-from-motion displays to demonstrate this effect more directly. In these displays, a set of moving dots on a computer screen can produce a compelling sense of a transparent three-dimensional cylinder rotating about its own axis. However, the direction of rotation (which surface is the front surface) is not defined, and hence the direction perceived by the observer fluctuates. Importantly, however, the percept is typically stable for a few seconds. Furthermore, it is possible to make the percept stable by adding binocular disparity, which defines which surface of the cylinder is in front. We have exploited these two features to control perception of the ambiguous cylinder on some trials. At the start of these trials binocular disparity renders the stimulus unambiguous, but after 500 ms the disparity is set to zero, and the rotation continues for 1500 ms. Human observers typically perceive the same direction of rotation throughout such trials: the percept established by disparity during the first 500 ms persists even when the disparity is removed. Behavioral measures in two monkeys confirm this phenomenon. This manipulation produces a set of trials in which identical stimuli are presented for 1500 ms, and yet how those stimuli are perceived is under experimental control.
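As a rough illustration of the trial structure described above, the sketch below generates dot positions for a structure-from-motion cylinder in which binocular disparity defines the front surface only during an initial 500 ms cue period, after which the disparity is set to zero and the rotation is ambiguous. The function name and all parameter values are illustrative assumptions, not the laboratory's actual stimulus code.

```python
"""Minimal sketch (hypothetical parameters) of a structure-from-motion cylinder:
dots on a transparent rotating cylinder are projected orthographically, and
binocular disparity is applied only during an initial cue period."""
import numpy as np

def cylinder_frames(n_dots=200, radius=1.0, rotation_hz=0.5, frame_rate=60,
                    cue_ms=500, ambiguous_ms=1500, cue_disparity=0.1, seed=0):
    rng = np.random.default_rng(seed)
    theta = rng.uniform(0, 2 * np.pi, n_dots)   # angular position on the cylinder
    y = rng.uniform(-1.0, 1.0, n_dots)          # height along the cylinder axis
    n_frames = int((cue_ms + ambiguous_ms) / 1000 * frame_rate)
    frames = []
    for f in range(n_frames):
        t = f / frame_rate
        phase = theta + 2 * np.pi * rotation_hz * t
        x = radius * np.cos(phase)              # orthographic projection: screen x
        depth = radius * np.sin(phase)          # signed depth (front vs. back surface)
        # Disparity defines the front surface only during the cue period;
        # afterwards it is zero and the direction of rotation is ambiguous.
        if t < cue_ms / 1000:
            disparity = cue_disparity * depth
        else:
            disparity = np.zeros_like(depth)
        frames.append((x - disparity / 2, x + disparity / 2, y))  # left eye, right eye, y
    return frames
```

Note that once the disparity term is zeroed, the left- and right-eye dot positions are identical, so the remaining 1500 ms of motion is physically the same regardless of which rotation direction was cued.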
The top-down framework above makes a very strong prediction about this situation: on trials where the experimenter induces a percept of rotation in a neuron's preferred direction, neuronal activity should increase. This is exactly what we observed in neurons in area MT of awake animals performing this task. Quantitative comparisons with Choice Probability measures in the equivalent task suggest that CP largely reflects these top-down processes.
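For concreteness, a Choice Probability of the kind referred to above can be computed as the area under the ROC curve comparing spike-count distributions conditioned on the animal's choice, so that CP = 0.5 indicates no choice-related modulation. The sketch below shows one standard way to compute this (via the pairwise Mann-Whitney comparison); the input arrays and simulated counts are purely hypothetical.

```python
"""Minimal sketch of a choice-probability calculation: the area under the ROC
curve comparing spike counts on preferred-choice vs. null-choice trials."""
import numpy as np

def choice_probability(counts_pref_choice, counts_null_choice):
    pref = np.asarray(counts_pref_choice, dtype=float)
    null = np.asarray(counts_null_choice, dtype=float)
    # Pairwise comparisons (ties counted as half) give the Mann-Whitney U
    # statistic; dividing by the number of pairs yields the ROC area.
    greater = (pref[:, None] > null[None, :]).sum()
    ties = (pref[:, None] == null[None, :]).sum()
    return (greater + 0.5 * ties) / (pref.size * null.size)

# Example with simulated counts: slightly higher firing when the animal
# reports the neuron's preferred depth on zero-signal trials.
rng = np.random.default_rng(1)
cp = choice_probability(rng.poisson(12, 80), rng.poisson(10, 80))
print(round(cp, 3))   # a value modestly above 0.5
```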

Agency: National Institutes of Health (NIH)
Institute: National Eye Institute (NEI)
Type: Investigator-Initiated Intramural Research Projects (ZIA)
Project #: 1ZIAEY000404-10
Application #: 8339770
Support Year: 10
Fiscal Year: 2011
Total Cost: $655,563
Name: U.S. National Eye Institute
Quaia, Christian; Optican, Lance M; Cumming, Bruce G (2018) Binocular summation for reflexive eye movements. J Vis 18:7
Seemiller, Eric S; Cumming, Bruce G; Candy, T Rowan (2018) Human infants can generate vergence responses to retinal disparity by 5 to 10 weeks of age. J Vis 18:17
Quaia, Christian; Optican, Lance M; Cumming, Bruce G (2017) Suppression and Contrast Normalization in Motion Processing. J Neurosci 37:11051-11066
Quaia, Christian; Optican, Lance M; Cumming, Bruce G (2017) Combining 1-D components to extract pattern information: It is about more than component similarity. J Vis 17:21
Tarawneh, Ghaith; Nityananda, Vivek; Rosner, Ronny et al. (2017) Invisible noise obscures visible signal in insect motion detection. Sci Rep 7:3496
Clery, Stephane; Cumming, Bruce G; Nienborg, Hendrikje (2017) Decision-Related Activity in Macaque V2 for Fine Disparity Discrimination Is Not Compatible with Optimal Linear Readout. J Neurosci 37:715-725
Joiner, Wilsaan M; Cavanaugh, James; Wurtz, Robert H et al. (2017) Visual Responses in FEF, Unlike V1, Primarily Reflect When the Visual Context Renders a Receptive Field Salient. J Neurosci 37:9871-9879
Read, Jenny C A; Cumming, Bruce G (2017) Visual Perception: Neural Networks for Stereopsis. Curr Biol 27:R594-R596
Henriksen, Sid; Tanabe, Seiji; Cumming, Bruce (2016) Disparity processing in primary visual cortex. Philos Trans R Soc Lond B Biol Sci 371:
Quaia, Christian; Optican, Lance M; Cumming, Bruce G (2016) A Motion-from-Form Mechanism Contributes to Extracting Pattern Motion from Plaids. J Neurosci 36:3903-18

Showing the most recent 10 out of 36 publications