The long-term goal of this research is to understand the basic neural computations that transform sensory signals into representations that support cognitive processes and guide subsequent actions. The research proposed here tests how signals flowing through the lateral intraparietal area (LIP) underlie the accumulation and maintenance of sensory evidence for guiding eye movements. We approach this topic by adapting a computational framework derived from previous work using a direction-discrimination paradigm and extending it to a wider range of behavioral tasks, experimental phenomena, and theoretical approaches. Our primary goal is to test the degree to which persistent and ramping activity in LIP reflects neural time integration, and how this computation relates to sensory stimulation, working memory, and oculomotor planning. Our secondary goal is to better integrate the quantitative inferences available from the direction-discrimination paradigm and the time-integration hypothesis with other work on oculomotor computations.

We have previously demonstrated that neural activity in area LIP reflects the temporal accumulation of noisy sensory evidence during performance of a motion-direction decision-making task. Because LIP activity is driven strongly by the onset of a visual saccade target, it is necessary to test whether temporal integration in LIP depends on the presence of a visual stimulus within the response field. We will also assess the cognitive role of persistent activity in LIP. If persistent activity in LIP supports the combination of multiple pieces of evidence acquired at different times, then LIP activity during interleaved periods of stimulus viewing and working memory should reflect the time integral of the sensory evidence.
We will test this hypothesis by incorporating a blank memory-delay period between successive periods of sensory evidence and evaluating whether activity during the delay reflects the accumulated and maintained level of evidence. We will further investigate the relationship between LIP activity and cognitive function with a similar task in which two stimuli must be compared, rather than combined. In short, we will manipulate sensory, cognitive, and motor variables to assess the degree to which LIP activity supports time integration when: (1) visual stimulation of the response field is manipulated; (2) successive periods of accumulation and maintenance of sensory evidence occur; and (3) the specific eye movement is not known during the formation of a decision. These measurements will also provide several opportunities to characterize cell-by-cell diversity in sensory, memory, decision, and motor responses.
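The prediction above can be illustrated with a minimal simulation. The sketch below is purely illustrative and is not the proposal's model or fitted to any data: it assumes a simple drift-diffusion-style accumulator that sums noisy evidence during viewing epochs and holds its value across a blank delay, as the time-integration hypothesis predicts for LIP persistent activity. All parameter values (drift, noise, epoch lengths) are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed so the example is reproducible

def accumulate_with_delay(drift, n_view=50, n_delay=30, noise_sd=1.0):
    """Integrate noisy evidence over two viewing epochs separated by a blank delay.

    During viewing epochs the accumulator sums (drift + Gaussian noise) samples;
    during the delay it maintains its current value, so delay-period activity
    reflects the accumulated level of evidence. Illustrative only.
    """
    trace = [0.0]
    # first viewing epoch: accumulate noisy evidence
    for _ in range(n_view):
        trace.append(trace[-1] + drift + noise_sd * rng.standard_normal())
    # blank memory delay: hold (remember) the accumulated value
    for _ in range(n_delay):
        trace.append(trace[-1])
    # second viewing epoch: resume accumulation from the maintained value
    for _ in range(n_view):
        trace.append(trace[-1] + drift + noise_sd * rng.standard_normal())
    return np.array(trace)

trace = accumulate_with_delay(drift=0.2)
# during the delay (samples 50..80) the trace is flat: the integral is maintained
```

Under this toy account, delay-period activity carries the running integral forward, so the second viewing epoch builds on the first; a memory system that did not integrate would instead reset or drift back toward baseline during the blank period.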

Public Health Relevance

Primate behavior is intelligent, flexible, and complex. These behavioral strengths often derive from the ability of human and nonhuman primates to combine multiple pieces of evidence acquired at different times. A thorough understanding of how the brain performs this "temporal integration" will benefit neurological treatments, as well as inform technical applications involved in the control and design of intelligent prosthetics and robotics.

National Institutes of Health (NIH)
National Eye Institute (NEI)
Research Project (R01)
Project #
Application #
Study Section
Central Visual Processing Study Section (CVP)
Program Officer
Steinmetz, Michael A
Project Start
Project End
Budget Start
Budget End
Support Year
Fiscal Year
Total Cost
Indirect Cost
University of Texas at Austin
Schools of Arts and Sciences
United States
Zip Code
Yates, Jacob L; Park, Il Memming; Katz, Leor N et al. (2017) Functional dissection of signal and noise in MT and LIP during decision-making. Nat Neurosci 20:1285-1292
Katz, Leor N; Yates, Jacob L; Pillow, Jonathan W et al. (2016) Dissociated functional significance of decision-related activity in the primate dorsal stream. Nature 535:285-8
Latimer, Kenneth W; Yates, Jacob L; Meister, Miriam L R et al. (2015) Single-trial spike trains in parietal cortex reveal discrete steps during decision-making. Science 349:184-7
Park, Il Memming; Meister, Miriam L R; Huk, Alexander C et al. (2014) Encoding and decoding in parietal cortex during sensorimotor decision-making. Nat Neurosci 17:1395-403
Meister, Miriam L R; Hennig, Jay A; Huk, Alexander C (2013) Signal multiplexing and single-neuron computations in lateral intraparietal area during decision-making. J Neurosci 33:2254-67
Huk, Alexander C; Meister, Miriam L R (2012) Neural correlates and neural computations in posterior parietal cortex during perceptual decision-making. Front Integr Neurosci 6:86
Huk, Alexander C (2012) Multiplexing in the primate motion pathway. Vision Res 62:173-80
Czuba, Thaddeus B; Rokers, Bas; Huk, Alexander C et al. (2012) To CD or not to CD: Is there a 3D motion aftereffect based on changing disparities? J Vis 12:7
Eastman, Kyler M; Huk, Alexander C (2012) PLDAPS: A Hardware Architecture and Software Toolbox for Neurophysiology Requiring Complex Visual Stimuli and Online Behavioral Control. Front Neuroinform 6:1
Czuba, Thaddeus B; Rokers, Bas; Guillet, Kyle et al. (2011) Three-dimensional motion aftereffects reveal distinct direction-selective mechanisms for binocular processing of motion through depth. J Vis 11:18