Humans make decisions and perform actions in situations in which all aspects of the decision or action are potentially stochastic. There are five components to the planning of an action based on sensory information. First, the subject has prior information about the state of the environment, including the current positions and velocities of nearby objects and of the subject's own body, which can be summarized as a probability distribution across possible world states. Second, the subject has sensory input about the current state of the environment, which is uncertain due to physical and neural noise. Third, these two sources of information are combined to decide on an intended action (a button press, an arm or eye movement, or a complex plan that includes responses to potential subsequent sensory inputs). Fourth, the resulting action can differ from the intended one due to motor noise. Finally, the interaction of the resulting action with the current environment leads to a consequence (a loss or gain), and this consequence may be uncertain as well. As a result of all these stochastic components, visual tasks and movement planning require a calculation that is equivalent to decision-making under risk. In our recent work, we have demonstrated that humans are nearly optimal in many visuomotor tasks, in that they maximize expected gain, and we have identified other circumstances in which human behavior is suboptimal. We propose experiments to better understand the nature of human behavior in visual and visuomotor tasks. We often use tasks with an experimenter-specified reward/penalty structure; this novel approach allows us to compare behavior with the optimal strategy that maximizes expected gain. We ask the following questions and propose experiments to address each. (1) How is behavior planned in visually guided movements?
We will investigate the coordinate systems used to encode visually guided actions and how the encoding of movements affects the ability of the visuomotor system to adapt to changing conditions. (2) What does human performance in visual search tasks with clearly defined gains and losses tell us about the encoding of visual patterns and visual uncertainty? We will compare human performance in visual search tasks to ideal-observer models that maximize expected gain in situations with asymmetric payoffs. The results of these experiments will enable us to distinguish different hypotheses about the encoding of visual information in the periphery. In both aims we use patterns of visuomotor performance (while performing a reach, saccade, or keypress) to learn about the underlying encoding of visual stimuli, uncertainty, and visually guided movement. These studies will shed light on the way in which visual stimuli and movements are encoded, and on how vision is used to guide action.
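The computation the abstract describes, choosing an action that maximizes expected gain given motor variability and an experimenter-specified reward/penalty structure, can be illustrated with a minimal one-dimensional sketch. All specifics below (the interval positions, the +100/-500 payoffs, the noise level sigma) are hypothetical illustration values, not parameters from the proposed experiments; the grid search simply stands in for whatever optimization an ideal planner would perform.

```python
import math

def norm_cdf(x, mu, sigma):
    """Cumulative distribution function of a Gaussian with mean mu, sd sigma."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def expected_gain(aim, sigma, reward=(0.0, 1.0), penalty=(-0.5, 0.5),
                  gain=100.0, loss=-500.0):
    """Expected gain for aiming at `aim` with Gaussian motor noise.

    Landing in the reward interval earns `gain`; landing in the
    (partially overlapping) penalty interval costs `loss` in addition.
    All numbers are hypothetical, chosen only to make the example concrete.
    """
    p_reward = norm_cdf(reward[1], aim, sigma) - norm_cdf(reward[0], aim, sigma)
    p_penalty = norm_cdf(penalty[1], aim, sigma) - norm_cdf(penalty[0], aim, sigma)
    return gain * p_reward + loss * p_penalty

# Grid search for the aim point that maximizes expected gain.
aims = [i / 1000.0 for i in range(-500, 2001)]
best_aim = max(aims, key=lambda a: expected_gain(a, sigma=0.3))
```

Because the penalty interval overlaps the left half of the reward interval, the optimal aim point shifts to the right of the reward region's center (0.5), trading a lower probability of reward for a much lower probability of the costly penalty; larger motor noise produces a larger shift.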

Public Health Relevance

The proposed work benefits public health by characterizing the behavioral and neural mechanisms involved in making perceptual decisions and in using sensory information to control movements. We show how optimal decisions and movement plans must take into account prior knowledge, the uncertainty of visual information, the variability of motor responses, and the consequences of action. A variety of medical conditions can impair both the reliability of visual information (e.g., cataract, amblyopia) and the quality of motor output and the response to risk (e.g., Parkinson's disease, Huntington's disease, stroke). The proposed research will improve our understanding of how visual patterns and planned movements are encoded so as to optimize a perceptual decision or movement plan, and thus can help in the design of rehabilitative plans when sensory input or motor output is disrupted (a change in bias, gain, and/or variability) by disease or other health-related conditions.

Agency: National Institutes of Health (NIH)
Institute: National Eye Institute (NEI)
Type: Research Project (R01)
Project #: 5R01EY008266-23
Application #: 8464382
Study Section: Special Emphasis Panel (SPC)
Program Officer: Wiggs, Cheri
Project Start: 1989-08-01
Project End: 2015-04-30
Budget Start: 2013-05-01
Budget End: 2014-04-30
Support Year: 23
Fiscal Year: 2013
Total Cost: $286,414
Indirect Cost: $96,414
Name: New York University
Department: Psychology
Type: Schools of Arts and Sciences
DUNS #: 041968306
City: New York
State: NY
Country: United States
Zip Code: 10012
Ackermann, John F; Landy, Michael S (2015) Suboptimal decision criteria are predicted by subjectively weighted probabilities and rewards. Atten Percept Psychophys 77:638-58
Ackermann, John F; Landy, Michael S (2014) Statistical templates for visual search. J Vis 14:18
Westrick, Zachary M; Landy, Michael S (2013) Pooling of first-order inputs in second-order vision. Vision Res 91:108-17
Westrick, Zachary M; Henry, Christopher A; Landy, Michael S (2013) Inconsistent channel bandwidth estimates suggest winner-take-all nonlinearity in second-order vision. Vision Res 81:58-68
Ackermann, John F; Landy, Michael S (2013) Choice of saccade endpoint under risk. J Vis 13:
Hudson, Todd E; Landy, Michael S (2012) Adaptation to sensory-motor reflex perturbations is blind to the source of errors. J Vis 12:4
Vlaskamp, Bjorn N S; Yoon, Geunyoung; Banks, Martin S (2011) Human stereopsis is not limited by the optics of the well-focused eye. J Neurosci 31:9814-8
Doerschner, Katja; Maloney, Laurence T; Boyaci, Huseyin (2010) Perceived glossiness in high dynamic range scenes. J Vis 10:11
Doerschner, Katja; Boyaci, Huseyin; Maloney, Laurence T (2010) Estimating the glossiness transfer function induced by illumination change and testing its transitivity. J Vis 10:8.1-9
Zhang, Hang; Wu, Shih-Wei; Maloney, Laurence T (2010) Planning multiple movements within a fixed time limit: the cost of constrained time allocation in a visuo-motor task. J Vis 10:1

Showing the most recent 10 out of 88 publications