For the past decade, it has been recognized that successful comprehension of language is facilitated by mental simulation, i.e., the mental re-enactment of real-world perceptual and motor experiences. This simulation occurs during language understanding because linguistic knowledge is grounded in our experiences.

However, studies investigating the role of simulation in formulating utterances are limited in both number and methodology. Using a promising new experimental method that requires hand/arm motions during language production, this dissertation examines whether motor activities affect cognitive processes and ultimately influence the shape of the resulting language. That is, do speakers change their linguistic choices as a result of related hand/arm movements? This study examines both English and Japanese. The inclusion of Japanese, a language with relatively flexible word order, allows critical insight into simulation in sentence production, since word order can reflect the temporal order of stages in the simulated event, in addition to the choice of linguistic content and speech onset time.

Two sets of experiments are conducted in the two languages. The first set explores the interaction between motor activities and unconstrained messages (i.e., messages in which the relationships among the event's participants and objects are not yet established). The second set investigates the relationship between motion and the production of a fixed message. The timing of the physical motions is manipulated in each set of experiments in order to explore the time course of integrating motor activities into meaning-construction processes. Results from two typologically distinct languages showing that meaning construction and language production are sensitive to motor actions would suggest that language is cognitively grounded and acquired through a tight bond with our past experiences. This project will help shed light on the influence of motor actions on the human cognitive system and, more generally, on the relationship between linguistic and non-linguistic cognition.

Project Report

Previous research has argued that we understand sentences by mentally recreating the events described in them (e.g., Barsalou, 1999). For example, we comprehend "John closed the drawer" by connecting it to previous experiences of closing drawers or witnessing drawers being closed. Such research has demonstrated that reading a sentence like "John closed the drawer" can affect the speed of an immediately following bodily motion, such as extending the arm to press a keyboard button, which involves motor actions similar to closing a drawer (e.g., Glenberg & Kaschak, 2002). However, before this project was initiated, there had been almost no work investigating whether non-linguistic motor actions can influence how we produce sentences. The results of this project suggest that they can.

The project involved a series of experiments on how bodily motions affect sentence production in Japanese and English, testing native speakers of each language. Specifically, the project examined whether moving a computer mouse toward or away from the body affected: the speed with which the speaker began uttering a sentence, the content used to discuss a set of pictures, and, in Japanese, the word order chosen by the speaker. (Word order is much more flexible in Japanese than in English.) Importantly, the bodily motions were cued by non-linguistic information, and from the speakers' perspective there was no apparent relationship between the mouse movements and the language they had to produce or the pictures they had to discuss.

The results showed that speakers were more likely to use language describing motion toward the speaker if they had just moved their arm toward their own body, and more likely to use away-language after extending their arm to move the mouse away from their body. Speakers were also faster to initiate speech when the direction of the event described in the sentence matched the direction of the arm movement. In Japanese, there was an additional interaction among bodily motion, toward- versus away-language, and word order, such that the order in which concepts were mentioned tended to reflect the order in which a person would interact with the corresponding objects in the physical world. Overall, the results indicate that our speech can be influenced by recent non-linguistic bodily motions that were not intended to affect the speaker's message. These results have implications for our understanding of how languages are learned and used, and for the connections among language, perception, and action.

Budget Start: 2010-02-15
Budget End: 2012-01-31
Fiscal Year: 2009
Total Cost: $6,316
Name: University of Hawaii
City: Honolulu
State: HI
Country: United States
Zip Code: 96822