Project III examines the remarkable flexibility of phonological behavior that allows its transmission by various optical signals, extending our primary claims that reading is founded on extracting phonological form from print and that phonological structure is gestural. Talking faces, printed words, pictures, and signaling hands are all optical signals that provide access to phonological information, but they differ in the completeness and directness with which they reveal it. Whereas Projects I-II focus primarily on acoustic speech signals, Project III begins with the observation that the articulatory gestures of speech also have consequences in dynamic facial movements, which directly but incompletely reflect vocal tract movements. Our first specific aim examines how speech perceivers extract phonological form from those dynamic facial patterns. Printed words, by contrast, provide relatively complete phonological information, but it is conveyed indirectly via an arbitrary, learned code. While Projects IV-VI probe the phonological basis for reading per se, our second specific aim here is to evaluate whether the phonological form extracted from print is shared with that of speech and whether both are gestural. Pictures offer access (albeit indirect and imprecise) to words as phonological forms via semantic association rather than orthographic encoding. Their inadequacies differ from those of faces. Our third specific aim compares phonological access via pictures, print, faces, and speech. If there is a common phonology, individual performance differences should co-vary across perception of speech and these various optical signals. Our fourth specific aim explores co-variation in performance in accessing phonological forms from speech, print, and faces. Sign languages use a very different optical signal, offering direct and complete information about the phonological structure of signs. Yet no written forms of sign languages are in wide use, so literacy cannot interact with sign phonological structure. Our fifth specific aim tests how signers extract phonological information from signs, in tasks similar to those used for spoken languages. The results will inform us about the multiple instantiations of phonological behavior beyond the acoustic speech signal. Such knowledge is of central theoretical and clinical significance.

Agency: National Institutes of Health (NIH)
Institute: Eunice Kennedy Shriver National Institute of Child Health & Human Development (NICHD)
Type: Research Program Projects (P01)
Project #: 5P01HD001994-37
Application #: 6564620
Study Section: Special Emphasis Panel (ZHD1)
Project Start: 2002-02-01
Project End: 2003-01-31
Budget Start: 1998-10-01
Budget End: 1999-09-30
Support Year: 37
Fiscal Year: 2002
Total Cost: $176,978
Indirect Cost:
Name: Haskins Laboratories, Inc.
Department:
Type:
DUNS #: 060010147
City: New Haven
State: CT
Country: United States
Zip Code: 06511
Xia, Zhichao; Zhang, Linjun; Hoeft, Fumiko et al. (2018) Neural Correlates of Oral Word Reading, Silent Reading Comprehension, and Cognitive Subcomponents. Int J Behav Dev 42:342-356
Earle, F Sayako; Landi, Nicole; Myers, Emily B (2018) Adults with Specific Language Impairment fail to consolidate speech sounds during sleep. Neurosci Lett 666:58-63
Schmidtke, Daniel; Van Dyke, Julie A; Kuperman, Victor (2018) Individual variability in the semantic processing of English compound words. J Exp Psychol Learn Mem Cogn 44:421-439
Ryherd, K; Jasinska, K; Van Dyke, J A et al. (2018) Cortical regions supporting reading comprehension skill for single words and discourse. Brain Lang 186:32-43
Patael, Smadar Z; Farris, Emily A; Black, Jessica M et al. (2018) Brain basis of cognitive resilience: Prefrontal cortex predicts better reading comprehension in relation to decoding. PLoS One 13:e0198791
Landi, Nicole; Malins, Jeffrey G; Frost, Stephen J et al. (2018) Neural representations for newly learned words are modulated by overnight consolidation, reading skill, and age. Neuropsychologia 111:133-144
Hong, Tian; Shuai, Lan; Frost, Stephen J et al. (2018) Cortical Responses to Chinese Phonemes in Preschoolers Predict Their Literacy Skills at School Age. Dev Neuropsychol 43:356-369
Siegelman, Noam; Bogaerts, Louisa; Kronenfeld, Ofer et al. (2018) Redefining "Learning" in Statistical Learning: What Does an Online Measure Reveal About the Assimilation of Visual Regularities? Cogn Sci 42 Suppl 3:692-727
Olmstead, Annie J; Viswanathan, Navin (2018) Lexical exposure to native language dialects can improve non-native phonetic discrimination. Psychon Bull Rev 25:725-731
Hendren, Robert L; Haft, Stephanie L; Black, Jessica M et al. (2018) Recognizing Psychiatric Comorbidity With Reading Disorders. Front Psychiatry 9:101

Showing the most recent 10 out of 457 publications