The choice of where to look next entails selecting one particular motor action from a repertoire of available options, with the selection process being guided primarily by current perceptual information and, more indirectly, by internal factors such as motivation, current goals, and previous experience. The goal of this project is to develop and test a mechanistic framework for describing how perceptual and motor-planning processes dynamically interact and give rise to saccadic choices. What is particularly ambitious about this project is that it aims to provide a parametrically detailed framework that is applicable to a large family of tasks while being tightly constrained by neurophysiology.

In traditional studies of choice behavior, a decision based on a sensory stimulus is made first and is then followed by a motor report. Under such conditions, a choice is often conceived as a serial process of perceptual evaluation followed by action selection, where the perceptual judgment (e.g., a color or motion discrimination) is relatively slow (∼hundreds of ms). However, in the case of saccadic choices this scheme is misleading, because under natural viewing conditions the median time between gaze fixations is quite short (200–250 ms), and the next saccade is always being planned. Based on urgency manipulations, recent work from our laboratory has uncovered many details about how perception and attention guide the choice process under more temporally realistic conditions, i.e., when the perceptual evaluation occurs rapidly (< 50 ms) and informs oculomotor plans that are already ongoing. By combining our urgent-choice paradigms with neurophysiological and theoretical results, we have developed a modeling framework that (1) is applicable to a wide range of saccadic choice tasks, (2) replicates rich psychophysical data with exquisite detail, and (3) is firmly consistent with the oculomotor activity observed in the frontal eye field (FEF).

Here we propose to develop and test this framework and its predictions with a variety of saccadic choice tasks to be performed by human subjects. These tasks give rise to psychometric measurements that are unique in their temporal resolution, and based on such measurements, we will investigate how exogenous (saliency-driven) and endogenous (rule-driven) spatial attention relate to oculomotor activity, how they interact with and differ from each other, and how their dynamics relate to individual differences in task performance between participants. Using the computational model to fit the data, we will test specific mechanistic hypotheses about visuomotor interactions.

The key innovations of this project are, first, that it investigates eye movements on the rapid timescale that is relevant for naturally occurring visuomotor behaviors; second, that it is integrative, i.e., it aims to synthesize numerous neurophysiological and psychophysical results into a small number of principles whereby perceptuo-motor interactions give rise to saccadic choices; and third, that it identifies robust psychometric properties that can serve to characterize fundamental cognitive functions in both healthy and clinical populations.
This is a basic science project that proposes to investigate how visual information is used by the brain to decide where to look next. Human subjects will perform a variety of visual tasks, and their performance will be simulated in detail in a computer using a model that realistically replicates the neural activity recorded in oculomotor areas of the cerebral cortex. The results will reveal how the deployment of spatial attention relates to the planning of eye movements, how attention varies over time, and how it differs between individual participants and, potentially, between healthy and clinical (e.g., ADHD) populations.