This project developed a system that allows complex displays to be manipulated in real time, contingent on the eye position of an observer. The goal of these experiments is to examine the nature of the representations that guide visual behavior during normal tasks involving movements of the eyes and coordination of the eyes and hand. The experiments extend the analysis of visual operations to processes operating over a longer time scale than is usually considered, with an unprecedented degree of stimulus control, allowing visual representations to be analyzed moment by moment. The other innovative aspect of the technique is that measuring the consequences of the visual manipulation on task performance (e.g., fixation duration) provides a much more sensitive indicator of the experimental manipulation than is normally available. The experimental methodology has been developed over the last two years, and our results in the current grant period reveal that vision may be much more task-dependent than previously thought: different fixations on the same visual stimulus can serve different purposes. The results also indicate that the visual information retained across successive fixations depends on moment-by-moment task demands and is used to minimize the load on working memory.

Recent Progress. The software for virtual displays of the Baufix parts has been installed. It was made available to us through our collaboration with Sagerer at the University of Bielefeld.
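For illustration only, the following minimal Python sketch shows the general shape of a gaze-contingent display loop of the kind described above. It is not the project's software: get_gaze_sample(), region_at(), run_trial(), and the region and content names are hypothetical stand-ins, and a real system would read samples from an eye-tracker SDK and render with a graphics library.

```python
"""Illustrative sketch of a gaze-contingent display loop (not the project's code).

Assumptions: gaze samples come from a hypothetical get_gaze_sample() stub that
stands in for an eye tracker, and the "display" is just a dictionary of named
regions whose contents can be swapped while the observer is looking elsewhere.
"""

import math
import random
import time

SAMPLE_RATE_HZ = 250      # assumed tracker sampling rate
MIN_FIXATION_MS = 80      # ignore dwell periods shorter than this

# Simulated gaze state: dwells on one region, occasionally "saccades" away.
_sim_target = {"pos": (200, 300)}

def get_gaze_sample():
    """Stand-in for an eye-tracker read; returns gaze (x, y) in pixels."""
    if random.random() < 0.01:  # ~1% chance per sample to saccade
        _sim_target["pos"] = random.choice([(200, 300), (600, 300)])
    cx, cy = _sim_target["pos"]
    return cx + random.gauss(0, 10), cy + random.gauss(0, 10)

def region_at(x, y, regions):
    """Return the name of the display region containing the gaze point, if any."""
    for name, (rx, ry, radius) in regions.items():
        if math.hypot(x - rx, y - ry) <= radius:
            return name
    return None

def run_trial(duration_s=2.0):
    """Run one trial: swap stimuli contingent on gaze, log fixation durations."""
    regions = {"left": (200, 300, 80), "right": (600, 300, 80)}
    contents = {"left": "part_A", "right": "part_B"}   # current display contents
    fixation_log = []                                  # (region, duration_ms) pairs
    current_region, fix_start = None, None

    t_end = time.time() + duration_s
    while time.time() < t_end:
        x, y = get_gaze_sample()
        hit = region_at(x, y, regions)

        if hit != current_region:
            # The previous fixation ended: record its duration if long enough.
            if current_region is not None and fix_start is not None:
                dur_ms = (time.time() - fix_start) * 1000
                if dur_ms >= MIN_FIXATION_MS:
                    fixation_log.append((current_region, dur_ms))
            current_region, fix_start = hit, time.time()

            # Gaze-contingent manipulation: change the content of the region
            # the observer is NOT currently fixating.
            if hit is not None:
                other = "right" if hit == "left" else "left"
                contents[other] = random.choice(["part_A", "part_B", "part_C"])

        time.sleep(1.0 / SAMPLE_RATE_HZ)

    return fixation_log

if __name__ == "__main__":
    for region, dur in run_trial():
        print(f"fixated {region} for {dur:.0f} ms")
```

In a real experiment of this type, the display change would typically be triggered during a saccade so the swap itself is not visible, and fixation durations on changed versus unchanged items would serve as the performance measure the abstract refers to.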

Agency: National Institutes of Health (NIH)
Institute: National Center for Research Resources (NCRR)
Type: Biotechnology Resource Grants (P41)
Project #: 5P41RR009283-05
Application #: 6122963
Study Section:
Project Start: 1998-09-23
Project End: 1999-07-31
Budget Start: 1997-10-01
Budget End: 1998-09-30
Support Year: 5
Fiscal Year: 1998
Total Cost:
Indirect Cost:
Name: University of Rochester
Department:
Type:
DUNS #: 208469486
City: Rochester
State: NY
Country: United States
Zip Code: 14627
Rothkopf, Constantin A; Ballard, Dana H (2013) Modular inverse reinforcement learning for visuomotor behavior. Biol Cybern 107:477-90
Velarde, Carla; Perelstein, Elizabeth; Ressmann, Wendy et al. (2012) Independent deficits of visual word and motion processing in aging and early Alzheimer's disease. J Alzheimers Dis 31:613-21
Fernandez, Roberto; Duffy, Charles J (2012) Early Alzheimer's disease blocks responses to accelerating self-movement. Neurobiol Aging 33:2551-60
Rothkopf, Constantin A; Ballard, Dana H (2010) Credit assignment in multiple goal embodied visuomotor behavior. Front Psychol 1:173
Huxlin, Krystel R; Martin, Tim; Kelly, Kristin et al. (2009) Perceptual relearning of complex visual motion after V1 damage in humans. J Neurosci 29:3981-91
Rothkopf, Constantin A; Ballard, Dana H (2009) Image statistics at the point of gaze during human navigation. Vis Neurosci 26:81-92
Jovancevic-Misic, Jelena; Hayhoe, Mary (2009) Adaptive gaze control in natural environments. J Neurosci 29:6234-8
Kavcic, Voyko; Ni, Hongyan; Zhu, Tong et al. (2008) White matter integrity linked to functional impairments in aging and early Alzheimer's disease. Alzheimers Dement 4:381-9
Droll, Jason A; Hayhoe, Mary M; Triesch, Jochen et al. (2005) Task demands control acquisition and storage of visual information. J Exp Psychol Hum Percept Perform 31:1416-38
Bayliss, Jessica D; Inverso, Samuel A; Tentler, Aleksey (2004) Changing the P300 brain computer interface. Cyberpsychol Behav 7:694-704

Showing the most recent 10 out of 28 publications