Perception has long been known to reflect the brain's ability to pool information from many different senses simultaneously. It is therefore surprising that comparatively little is known about the neural integration that takes place in the cortical neurons believed to underlie these perceptions, and virtually nothing is known about how their physiological properties are crafted during ontogeny. In the current proposal, we posit that early postnatal experience plays a critical role in determining the nature of these multisensory processes and their consequent impact on perception and behavior. These postulates will be examined here using neurons in the anterior ectosylvian sulcus (AES), and AES-mediated detection/identification and localization behaviors, as models. The AES was targeted because of its high incidence of multisensory neurons, the prior demonstration that these neurons can integrate cross-modal information, and the involvement of this area in perception/behavior. In the first phase of this project, we will examine the normal chronology with which somatosensory, auditory, visual, and multisensory AES neurons develop, as well as the maturation of their ability to integrate cross-modal information. In the second phase, we will use a multidisciplinary approach to examine what appears to be the most parsimonious hypothesis regarding the necessary and sufficient experiential antecedents of this fundamental information-processing capability: specifically, repeated experience with temporally coincident cross-modal (e.g., visual-auditory) cues. The proposed experiments will also examine whether these visual-auditory experiences must be obtained during early postnatal stages, or whether cross-modal experiences are capable of defining and modifying these neural processes throughout life.

National Institutes of Health (NIH)
National Institute of Mental Health (NIMH)
Research Project (R01)
Study Section: Integrative, Functional and Cognitive Neuroscience 8 (IFCN)
Program Officer: Babcock, Debra J
Wake Forest University Health Sciences
Anatomy/Cell Biology
Schools of Medicine
United States
Noel, Jean-Paul; Stevenson, Ryan A; Wallace, Mark T (2018) Atypical audiovisual temporal function in autism and schizophrenia: similar phenotype, different cause. Eur J Neurosci 47:1230-1241
Stevenson, Ryan A; Baum, Sarah H; Segers, Magali et al. (2017) Multisensory speech perception in autism spectrum disorder: From phoneme to whole-word perception. Autism Res 10:1280-1290
Stevenson, Ryan A; Sheffield, Sterling W; Butera, Iliza M et al. (2017) Multisensory Integration in Cochlear Implant Recipients. Ear Hear 38:521-538
Nidiffer, Aaron R; Stevenson, Ryan A; Krueger Fister, Juliane et al. (2016) Interactions between space and effectiveness in human multisensory performance. Neuropsychologia 88:83-91
Murray, Micah M; Lewkowicz, David J; Amedi, Amir et al. (2016) Multisensory Processes: A Balancing Act across the Lifespan. Trends Neurosci 39:567-579
Krueger Fister, Juliane; Stevenson, Ryan A; Nidiffer, Aaron R et al. (2016) Stimulus intensity modulates multisensory temporal processing. Neuropsychologia 88:92-100
Noel, Jean-Paul; Wallace, Mark; Blake, Randolph (2015) Cognitive neuroscience: integration of sight and sound outside of awareness? Curr Biol 25:R157-R159
Altieri, Nicholas; Stevenson, Ryan A; Wallace, Mark T et al. (2015) Learning to associate auditory and visual stimuli: behavioral and neural mechanisms. Brain Topogr 28:479-493
Noel, Jean-Paul; Pfeiffer, Christian; Blanke, Olaf et al. (2015) Peripersonal space as the space of the bodily self. Cognition 144:49-57
Ghose, D; Wallace, M T (2014) Heterogeneity in the spatial receptive field architecture of multisensory neurons of the superior colliculus and its effects on multisensory integration. Neuroscience 256:147-62

Showing the most recent 10 out of 34 publications