This project examines the motor control capabilities of subjects with disordered, normal, and exceptional speech. Both speech and nonspeech tasks will be used for testing. The nonspeech tasks are based on a visuomotor tracking paradigm in which subjects track predictable and unpredictable targets with the lower lip, jaw, larynx, and (for the first time in these studies) the chest wall. Targets will be varied not only for frequency, as in previous studies, but also for displacement. Predictable targets will be tracked with and without visual feedback. Speech tasks will be used to examine movement patterns of the lips, jaw, larynx, and chest wall during accurate and inaccurate speech productions. A particular emphasis in the analyses of the kinematic data will be temporal coordination, especially temporal coherence, within and between speech-production structures during speech. Performance on the nonspeech tasks will be correlated with perceptual, acoustic, and kinematic data collected during speech. Ultimately, this research should aid in the development of diagnostic and treatment tasks that focus on the specific underlying motor control problems associated with particular speech disorders.
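As an illustration only (not the project's actual analysis pipeline), temporal coherence between two speech-production structures can be quantified as magnitude-squared coherence between their movement traces. The sketch below assumes two simulated kinematic channels (e.g., lower-lip and jaw displacement) sampled at a hypothetical 250 Hz and uses scipy.signal.coherence; the sampling rate, band limits, and signals are assumptions for demonstration.

    # Illustrative sketch: coherence between two simulated kinematic signals
    # (lower lip vs. jaw displacement), one way coupling between structures
    # can be quantified. Signals and parameters are assumed, not from the project.
    import numpy as np
    from scipy.signal import coherence

    fs = 250.0                      # assumed sampling rate (Hz) for kinematic channels
    t = np.arange(0, 10, 1 / fs)    # 10 s of simulated data

    # Simulated displacement traces sharing a common 4 Hz movement component
    # (roughly syllable rate), plus independent noise.
    shared = np.sin(2 * np.pi * 4 * t)
    lip = shared + 0.5 * np.random.randn(t.size)
    jaw = 0.8 * shared + 0.5 * np.random.randn(t.size)

    # Magnitude-squared coherence by frequency (0 = unrelated, 1 = perfectly coupled).
    f, Cxy = coherence(lip, jaw, fs=fs, nperseg=512)

    # Summarize coherence in the low-frequency band where speech movements concentrate.
    band = (f >= 1) & (f <= 8)
    print(f"Mean 1-8 Hz lip-jaw coherence: {Cxy[band].mean():.2f}")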