Research in this laboratory has demonstrated that 4-month-old infants recognize the correspondence between auditorially and visually presented speech sounds. They recognize that particular sounds emanate from mouths moving in particular ways, thus demonstrating one of the components of "lip-reading." Our current work shows that infants also relate nonspeech sounds to faces producing speech, and that they base this on their knowledge of speech. This finding led to the development of a hypothesis that accounts for our results on infants' cross-modal speech perception. The experiments proposed here extend our tests in four ways. First, following our new hypothesis, we continue studies of the basis of the effect by manipulating the visual stimuli (i.e., the faces) in these experiments. Second, we extend the studies on the development of the effect to include infants (1-14 months of age), children (3-year-olds), and adults. Third, we will initiate studies assessing the effects of visually presented rate-of-speech information. Fourth, using a new technique that isolates "parts" of faces, we explore auditory-visual "illusions" (the McGurk effect) in both adults and infants. The experimental outcomes are directly relevant to theories of speech perception and its development, as well as to theories of cognitive development. The data may also inform our understanding of, and treatment strategies for, deaf or blind infants.