Studies in humans have largely paralleled those described above in monkeys, but are constrained by the laboratory's mechanical capabilities, given the larger payload required to move human subjects; they are therefore limited to motion in the head's horizontal plane. Humans, however, provide unique opportunities to control the context in which experiments take place, including such subtleties as the tracking of imagined targets in darkness, and to assess perceptual variables in addition to reflex behavior.

We have now characterized the important features of the human LVOR and its interactions with vision during interaural oscillation across a range of frequencies (0.5–4 Hz). Horizontal eye movement responses are robust, binocular, and modulated by viewing distance, as measured by vergence (the angle between the eyes). The relationship between response amplitude and vergence in humans is remarkably close to that in monkeys and shows a similar frequency dependence. The presence of earth-fixed or head-fixed targets provides a context that can be used to enhance or suppress the LVOR response, even in darkness (with imagined targets). However, this influence is robust only at relatively low frequencies and follows the response dynamics typical of visual tracking. The characteristics of these visual and contextual influences resemble those of the AVOR, which we have also characterized in detail in our subjects.

We have also investigated the perception of motion in order to compare eye movement responses with psychophysical measures. A joystick task provided a measure of perceived translation. We found that the sensation of linear velocity was surprisingly transient, lasting only seconds. In contrast, when subjects were asked to respond proportionally with linear displacement in space, performance was considerably more accurate. These disparate results indicate that non-vestibular cues are being used to confer the improved position sense.
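The vergence scaling described above follows from simple viewing geometry: for a near target, a given interaural translation demands a larger compensatory eye rotation, and vergence is an inverse measure of target distance. The sketch below is not part of the grant record; it is a minimal illustration of that geometry, assuming a nominal interocular distance of 6.5 cm and a small-angle approximation for the required eye rotation.

```python
import math

IPD_M = 0.065  # assumed interocular distance (m); illustrative value only

def vergence_deg(distance_m: float, ipd_m: float = IPD_M) -> float:
    """Vergence angle (deg) for a target straight ahead at the given distance."""
    return math.degrees(2 * math.atan(ipd_m / (2 * distance_m)))

def ideal_lvor_gain(distance_m: float) -> float:
    """Geometrically ideal eye velocity (deg/s) per unit interaural head
    velocity (m/s), using the small-angle approximation theta ~ x / d.
    Gain is inversely proportional to distance, hence proportional to vergence."""
    return math.degrees(1.0 / distance_m)
```

For example, halving the viewing distance doubles both the vergence angle (to first order) and the ideal compensatory gain, which is the proportionality between response amplitude and vergence that the experiments measure.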

Agency: National Institutes of Health (NIH)
Institute: National Center for Research Resources (NCRR)
Type: Biotechnology Resource Grants (P41)
Project #: 5P41RR009283-08
Application #: 6481653
Project Start: 2001-08-01
Project End: 2002-07-31
Support Year: 8
Fiscal Year: 2001
Name: University of Rochester
DUNS #: 208469486
City: Rochester
State: NY
Country: United States
Zip Code: 14627
Rothkopf, Constantin A; Ballard, Dana H (2013) Modular inverse reinforcement learning for visuomotor behavior. Biol Cybern 107:477-90
Velarde, Carla; Perelstein, Elizabeth; Ressmann, Wendy et al. (2012) Independent deficits of visual word and motion processing in aging and early Alzheimer's disease. J Alzheimers Dis 31:613-21
Fernandez, Roberto; Duffy, Charles J (2012) Early Alzheimer's disease blocks responses to accelerating self-movement. Neurobiol Aging 33:2551-60
Rothkopf, Constantin A; Ballard, Dana H (2010) Credit assignment in multiple goal embodied visuomotor behavior. Front Psychol 1:173
Huxlin, Krystel R; Martin, Tim; Kelly, Kristin et al. (2009) Perceptual relearning of complex visual motion after V1 damage in humans. J Neurosci 29:3981-91
Rothkopf, Constantin A; Ballard, Dana H (2009) Image statistics at the point of gaze during human navigation. Vis Neurosci 26:81-92
Jovancevic-Misic, Jelena; Hayhoe, Mary (2009) Adaptive gaze control in natural environments. J Neurosci 29:6234-8
Kavcic, Voyko; Ni, Hongyan; Zhu, Tong et al. (2008) White matter integrity linked to functional impairments in aging and early Alzheimer's disease. Alzheimers Dement 4:381-9
Droll, Jason A; Hayhoe, Mary M; Triesch, Jochen et al. (2005) Task demands control acquisition and storage of visual information. J Exp Psychol Hum Percept Perform 31:1416-38
Bayliss, Jessica D; Inverso, Samuel A; Tentler, Aleksey (2004) Changing the P300 brain computer interface. Cyberpsychol Behav 7:694-704

Showing the most recent 10 out of 28 publications