Many aspects of language comprehension are closely time-locked to linguistic input. Psycholinguists have therefore increasingly relied on online experimental methods that provide fine-grained temporal information about language processing, e.g., monitoring eye movements during reading. However, most standard methods do not allow for continuous monitoring and cannot easily be adapted to natural situations. We are using the sensori-motor laboratory to monitor eye movements as subjects follow spoken instructions to manipulate objects (e.g., "Put the apple that's on the towel in the box."). With well-defined tasks, eye movements can illuminate the rapid mental processes that underlie spoken language comprehension. This approach can help explore topics ranging from the recognition of spoken words to conversational interactions during cooperative problem solving.

Preliminary experiments showed that participants process instructions incrementally, making saccadic eye movements to objects immediately after hearing the relevant words in an instruction. When asked to touch one of four blocks differing in marking, color, or shape, given instructions such as "Touch the starred yellow square," subjects made an eye movement to the target block an average of 250 ms after the end of the word that uniquely specified the target with respect to the visual alternatives (e.g., after "starred" if only one block was starred; after "square" if there were two starred yellow blocks). With more complex instructions, subjects made informative sequences of eye movements to the objects relevant to establishing reference (e.g., "Put the five of hearts that is below the eight of clubs above the three of diamonds.").
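The latency measure described above is computed relative to the "point of disambiguation": the offset of the word that uniquely identifies the target among the visual alternatives. A minimal sketch of that computation is below; the function name, trial fields, and all timing values are illustrative assumptions, not data or code from the study.

```python
# Hypothetical sketch of the saccade-latency measure: time from the
# offset of the disambiguating word to the first saccade landing on
# the target object. All trial values below are made up for illustration.

def mean_latency_ms(trials):
    """Mean latency (ms) from disambiguation-word offset to target saccade."""
    latencies = [t["saccade_onset_ms"] - t["disambig_offset_ms"] for t in trials]
    return sum(latencies) / len(latencies)

# Each trial records when the disambiguating word ended and when the
# eyes first moved to the target (both measured from trial onset).
trials = [
    {"disambig_offset_ms": 1200, "saccade_onset_ms": 1440},  # 240 ms
    {"disambig_offset_ms": 980,  "saccade_onset_ms": 1230},  # 250 ms
    {"disambig_offset_ms": 1500, "saccade_onset_ms": 1760},  # 260 ms
]
print(mean_latency_ms(trials))  # 250.0 with these illustrative numbers
```

A positive latency near 250 ms, as reported above, indicates that the saccade was programmed almost as soon as the disambiguating word ended, which is the evidence for incremental processing.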

Agency: National Institutes of Health (NIH)
Institute: National Center for Research Resources (NCRR)
Type: Biotechnology Resource Grants (P41)
Project #: 5P41RR009283-03
Application #: 5225755
Study Section:
Project Start:
Project End:
Budget Start:
Budget End:
Support Year: 3
Fiscal Year: 1996
Total Cost:
Indirect Cost:
Rothkopf, Constantin A; Ballard, Dana H (2013) Modular inverse reinforcement learning for visuomotor behavior. Biol Cybern 107:477-90
Fernandez, Roberto; Duffy, Charles J (2012) Early Alzheimer's disease blocks responses to accelerating self-movement. Neurobiol Aging 33:2551-60
Velarde, Carla; Perelstein, Elizabeth; Ressmann, Wendy et al. (2012) Independent deficits of visual word and motion processing in aging and early Alzheimer's disease. J Alzheimers Dis 31:613-21
Rothkopf, Constantin A; Ballard, Dana H (2010) Credit assignment in multiple goal embodied visuomotor behavior. Front Psychol 1:173
Huxlin, Krystel R; Martin, Tim; Kelly, Kristin et al. (2009) Perceptual relearning of complex visual motion after V1 damage in humans. J Neurosci 29:3981-91
Rothkopf, Constantin A; Ballard, Dana H (2009) Image statistics at the point of gaze during human navigation. Vis Neurosci 26:81-92
Jovancevic-Misic, Jelena; Hayhoe, Mary (2009) Adaptive gaze control in natural environments. J Neurosci 29:6234-8
Kavcic, Voyko; Ni, Hongyan; Zhu, Tong et al. (2008) White matter integrity linked to functional impairments in aging and early Alzheimer's disease. Alzheimers Dement 4:381-9
Droll, Jason A; Hayhoe, Mary M; Triesch, Jochen et al. (2005) Task demands control acquisition and storage of visual information. J Exp Psychol Hum Percept Perform 31:1416-38
Bayliss, Jessica D; Inverso, Samuel A; Tentler, Aleksey (2004) Changing the P300 brain computer interface. Cyberpsychol Behav 7:694-704

Showing the most recent 10 out of 28 publications