Language and reading impairments affect 16%-20% of US children and are stable, persisting through adolescence and adulthood. Deficits in even low-level skills like phonological processing and spoken word recognition persist into adulthood, and half of middle-school struggling readers show deficits in decoding and written word recognition. This proposal examines the development of spoken and written word recognition during late childhood. While words are a low-level language skill at these ages, they are central to language, linking phonology, orthography, and meaning.

At a cognitive level, word recognition is seen as a competition process: as the input (e.g., wizard) is heard (or read), people consider multiple partially matching words (whistle, lizard), which compete over time. Prior work assessed this in children using a paradigm in which listeners match words to pictures while their eye movements are monitored. As listeners begin to hear a word, their eyes move between candidates. These fixations reveal momentary consideration of alternative words and trace the dynamics of competition over milliseconds. We applied this paradigm to children, showing that competition is resolved more automatically between ages 9 and 16. Adolescents with language impairment showed a different pattern: they were similarly automatic but did not fully resolve competition by the end of processing.

This research documents that real-time processing develops, but it is unclear how. In older children, development is likely driven by multiple causes, such as vocabulary growth, the organization of phonological systems, the onset of reading instruction, and changes in executive function. This project examines the development and disorders of the automaticity and degree of competition resolution during lexical processing. It examines both spoken and written word processing to unpack the relationship between language and reading and to identify outcomes (good and poor) linked to differences in real-time processing.
The first aim is to determine the cognitive and developmental factors that shape real-time word recognition, and the consequences of these factors for language and reading outcomes. We will conduct an accelerated longitudinal study of 400 children (with typical and impaired language) between ages 7 and 12, combining eye-tracking measures of word recognition with tests of phonological processing, reading, vocabulary, and executive function.
The second aim uses cross-sectional laboratory studies to examine the consequences of differences in real-time processing for learning and for related processes like semantic processing (accessing meaning) and orthographic decoding (mapping print to sound).
The third aim uses laboratory training procedures to understand plasticity in real-time lexical processing; this may pave the way for interventions targeting lexical processing. Finally, the fourth aim develops computational models of typical and disordered lexical processing to attain a deeper understanding of which mechanisms of language processing change with development or differ in children with language disorders.

Public Health Relevance

Language and reading impairments affect 16% of children, and deficits in even low-level skills like word recognition and word reading persist through the lifespan, leading to poor academic and life outcomes. This proposal investigates the development and disorders of spoken and written word recognition in school-age children with both longitudinal and cross-sectional studies that use a unique eye-tracking measure to reveal how children process words millisecond by millisecond. This will reveal mechanisms of plasticity that support more automatic and precise word recognition, knowledge that will help develop better interventions and assessments for struggling readers and children with language and reading impairments.

Agency
National Institutes of Health (NIH)
Institute
National Institute on Deafness and Other Communication Disorders (NIDCD)
Type
Research Project (R01)
Project #
5R01DC008089-10
Application #
9824575
Study Section
Language and Communication Study Section (LCOM)
Program Officer
Cooper, Judith
Project Start
2007-01-08
Project End
2023-11-30
Budget Start
2019-12-01
Budget End
2020-11-30
Support Year
10
Fiscal Year
2020
Total Cost
Indirect Cost
Name
University of Iowa
Department
Psychology
Type
Schools of Arts and Sciences
DUNS #
062761671
City
Iowa City
State
IA
Country
United States
Zip Code
52242
Smith, Nicholas A; McMurray, Bob (2018) Temporal Responsiveness in Mother-Child Dialogue: A Longitudinal Analysis of Children with Normal Hearing and Hearing Loss. Infancy 23:410-431
McMurray, Bob; Ellis, Tyler P; Apfelbaum, Keith S (2018) How Do You Deal With Uncertainty? Cochlear Implant Users Differ in the Dynamics of Lexical Processing of Noncanonical Inputs. Ear Hear :
McMurray, Bob; Danelz, Ani; Rigler, Hannah et al. (2018) Speech categorization develops slowly through adolescence. Dev Psychol 54:1472-1491
Roembke, Tanja C; Wiggs, Kelsey K; McMurray, Bob (2018) Symbolic flexibility during unsupervised word learning in children and adults. J Exp Child Psychol 175:17-36
Kapnoula, Efthymia C; Winn, Matthew B; Kong, Eun Jong et al. (2017) Evaluating the sources and functions of gradiency in phoneme categorization: An individual differences approach. J Exp Psychol Hum Percept Perform 43:1594-1611
Samuelson, Larissa K; McMurray, Bob (2017) What does it take to learn a word? Wiley Interdiscip Rev Cogn Sci 8:
McMurray, Bob; Farris-Trimble, Ashley; Rigler, Hannah (2017) Waiting for lexical access: Cochlear implants or severely degraded input lead listeners to process speech less incrementally. Cognition 169:147-164
Apfelbaum, Keith S; McMurray, Bob (2017) Learning During Processing: Word Learning Doesn't Wait for Word Recognition to Finish. Cogn Sci 41 Suppl 4:706-747
Oleson, Jacob J; Cavanaugh, Joseph E; McMurray, Bob et al. (2017) Detecting time-specific differences between temporal nonlinear curves: Analyzing data from the visual world paradigm. Stat Methods Med Res 26:2708-2725
Rhone, Ariane E; Nourski, Kirill V; Oya, Hiroyuki et al. (2016) Can you hear me yet? An intracranial investigation of speech and non-speech audiovisual interactions in human cortex. Lang Cogn Neurosci 31:284-302

Showing the most recent 10 out of 51 publications