In addition to understanding the mechanisms by which signals about the visual world are produced in cortical neurons, it is important to establish which of these signals the subject actually uses for perception. Classical techniques (lesions, microstimulation) allow such inferences to be made about areas of cortex or small groups of neurons, but they lack single-neuron resolution. A powerful technique that has been successfully applied to single neurons is to measure correlations between neuronal activity and reported sensation. In my laboratory, animals are trained to report perceived depth in a threshold disparity task. Even on the fraction of trials that contain no signal, the activity of individual neurons is correlated with the animal's reports: on trials where the animal reports a near sensation, neurons with near preferred disparities fire more spikes than on trials where the animal reports a far sensation, even though the visual stimuli are identical. This correlation is typically quantified with a non-parametric measure called Choice Probability (CP), and CP measurements have been widely used in attempts to determine how different sensory neurons contribute to perception. A difficulty with this use of CP measurements is that, because they only measure a correlation, they do not demonstrate causation. One possibility, which is the widely adopted interpretation, is that the correlation arises because fluctuations in neuronal activity cause different perceptual states (the bottom-up account). Quantitative models based on this idea have been very successful in describing the data. But this success does not exclude a second class of model, in which something associated with the animal's perceptual state causes changes in the firing rate of sensory neurons (the top-down account). Although this problem has long been recognized, no experimental approach had been able to differentiate these two descriptions.
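Choice Probability is conventionally the area under the ROC curve comparing a neuron's spike-count distributions on trials grouped by the animal's choice; CP = 0.5 indicates no choice-related modulation. As a minimal sketch (the function name, trial groupings, and example counts below are illustrative, not from the project's analysis code), the ROC area can be obtained directly from the Mann-Whitney U statistic:

```python
# Sketch of a Choice Probability (CP) computation.
# CP is the area under the ROC curve comparing a neuron's spike-count
# distributions on trials grouped by the animal's report ("near" vs "far").
# CP = 0.5 means no choice-related modulation; CP > 0.5 means the neuron
# fires more on trials where the animal reports the neuron's preferred depth.

def choice_probability(preferred_counts, null_counts):
    """ROC area computed via the Mann-Whitney U statistic.

    preferred_counts: spike counts on trials where the animal reported the
                      neuron's preferred disparity (e.g. "near").
    null_counts:      spike counts on trials with the opposite report.
    """
    # Count pairs in which the preferred-choice count exceeds the
    # null-choice count; ties contribute half (standard U convention).
    u = 0.0
    for p in preferred_counts:
        for q in null_counts:
            if p > q:
                u += 1.0
            elif p == q:
                u += 0.5
    return u / (len(preferred_counts) * len(null_counts))

# Hypothetical example: a neuron firing slightly more on "near"-report trials.
near_trials = [12, 15, 9, 14, 11]   # spike counts, report = preferred
far_trials = [8, 10, 7, 12, 9]      # spike counts, report = null
cp = choice_probability(near_trials, far_trials)  # > 0.5 here
```

In practice, spike counts are usually z-scored within each stimulus condition before pooling across conditions, so that stimulus-driven rate differences do not inflate the measure; the sketch above shows only the core ROC-area step.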
Previous work in this project provided powerful, but indirect, evidence that at least part of this correlation reflects top-down signals associated with the animal's choice. We have now exploited perceptually bistable displays to demonstrate directly that perceptual state influences the activity of sensory neurons, such that their firing reflects the current percept. In these displays, a set of moving dots on a computer screen can produce a compelling sense of a transparent three-dimensional cylinder rotating about its own axis. However, the direction of rotation (which surface is in front) is not defined, and hence the direction perceived by the observer fluctuates. Importantly, however, the percept is typically stable for a few seconds. Furthermore, it is possible to make the percept stable by adding binocular disparity, which defines the front surface of the cylinder. If two such cylinders are presented simultaneously, there is a strong tendency to see both rotating in the same direction, even if both are in fact ambiguous. We exploited this by presenting two such cylinders and adding a small disparity to only one of them, rendering it unambiguous. We confirmed in trained animals that this manipulation biases the perception of the cylinder that has no disparity. We then studied the activity of neurons in area MT during presentation of these stimuli. Only the cylinder with zero disparity was presented in the receptive field of a given neuron. Independent control experiments confirmed that the disparity of the second cylinder did not directly affect the activity of the neuron under study. Nevertheless, when a zero-disparity cylinder was present in the receptive field, so that its perceived rotation was altered by disparity in the second cylinder, the firing rate systematically increased in conditions that biased the animal toward perceiving the configuration normally preferred by the neuron under study.
This therefore demonstrates that changes in the perceived configuration of a single stimulus cause changes in the activity of sensory neurons.

National Institute of Health (NIH)
National Eye Institute (NEI)
Investigator-Initiated Intramural Research Projects (ZIA)
Quaia, Christian; Optican, Lance M; Cumming, Bruce G (2017) Combining 1-D components to extract pattern information: It is about more than component similarity. J Vis 17:21
Tarawneh, Ghaith; Nityananda, Vivek; Rosner, Ronny et al. (2017) Invisible noise obscures visible signal in insect motion detection. Sci Rep 7:3496
Clery, Stephane; Cumming, Bruce G; Nienborg, Hendrikje (2017) Decision-Related Activity in Macaque V2 for Fine Disparity Discrimination Is Not Compatible with Optimal Linear Readout. J Neurosci 37:715-725
Read, Jenny C A; Cumming, Bruce G (2017) Visual Perception: Neural Networks for Stereopsis. Curr Biol 27:R594-R596
McFarland, James M; Cumming, Bruce G; Butts, Daniel A (2016) Variability and Correlations in Primary Visual Cortical Neurons Driven by Fixational Eye Movements. J Neurosci 36:6225-41
Cumming, Bruce G; Nienborg, Hendrikje (2016) Feedforward and feedback sources of choice probability in neural population responses. Curr Opin Neurobiol 37:126-132
Henriksen, Sid; Tanabe, Seiji; Cumming, Bruce (2016) Disparity processing in primary visual cortex. Philos Trans R Soc Lond B Biol Sci 371:
Quaia, Christian; Optican, Lance M; Cumming, Bruce G (2016) A Motion-from-Form Mechanism Contributes to Extracting Pattern Motion from Plaids. J Neurosci 36:3903-18
Sheliga, B M; Quaia, C; FitzGibbon, E J et al. (2016) Human short-latency ocular vergence responses produced by interocular velocity differences. J Vis 16:11
Sheliga, Boris M; Quaia, Christian; FitzGibbon, Edmond J et al. (2016) Ocular-following responses to white noise stimuli in humans reveal a novel nonlinearity that results from temporal sampling. J Vis 16:8

Showing the most recent 10 out of 32 publications