Temporal frequency is a fundamental sensory domain that is critical to how we communicate (e.g., speech processing by audition) and interact with objects in our environment (e.g., texture processing by touch). Our auditory and tactile senses redundantly signal temporal frequencies spanning tens to hundreds of cycles per second. This overlap enables audition and touch to interact, which can be beneficial: estimates formed by combining independent sensory cues are more reliable than estimates based on either cue alone. Despite this overlap, the relationship between auditory and tactile frequency processing mechanisms remains poorly understood, in part because previous investigations of the neural substrates supporting audition and touch have traditionally focused on a single sensory modality. Here we will test the hypothesis that common brain regions and neural mechanisms, termed supramodal, support auditory and tactile frequency processing.

We will develop a computational model of how sensory neurons may combine auditory and tactile frequency information and how these neurons may be changed by adaptation, and we will compare the model's predictions to behavioral data acquired in human psychophysical experiments. Using blood-oxygen-level-dependent functional magnetic resonance imaging (BOLD fMRI) and sensory adaptation, we will localize brain regions whose response patterns are consistent with neural adaptation, and we will use this approach to test whether brain regions represent (i.e., adapt to) both auditory and tactile frequency information. Using fMRI and multivariate pattern analysis (MVPA), we will identify the brain regions from which auditory and tactile frequency information can be decoded and determine whether the same regions support decodable frequency representations for both senses.

Our preliminary modeling, psychophysics, and imaging results suggest that multiple regions in perisylvian cortex, including areas classically defined as unimodal, display frequency-selective responses to both auditory and tactile stimulation. This pattern suggests that perisylvian areas may serve as a supramodal network for frequency processing. We further hypothesize that attention to vibration frequency enhances functional connectivity within this network. We will causally probe functional connectivity between somatosensory cortex and auditory cortex by combining transcranial magnetic stimulation (TMS) with fMRI (in concurrent TMS-fMRI experiments) and with behavior (in psychophysical experiments). Our hypothesis predicts that neural changes induced by TMS of somatosensory cortex will propagate to auditory cortex when subjects attend to vibration frequency, modulating auditory cortex activity and auditory perception. Such results would support the notion that classically defined somatosensory and auditory areas collaborate as a supramodal network to process temporal frequency information. Supramodal networks may also support other fundamental sensory operations, such as shape and motion processing.
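The claimed benefit of combining independent cues follows from the standard maximum-likelihood integration framework, in which each cue is weighted in proportion to its reliability (inverse variance). The Python sketch below illustrates that general framework only; the 200 Hz stimulus, the noise levels, and all variable names are illustrative assumptions, not parameters of this project's computational model.

```python
import numpy as np

# Minimal sketch of maximum-likelihood (reliability-weighted) cue combination.
# All numbers below are illustrative assumptions.
true_freq = 200.0                      # stimulus frequency (Hz)
sigma_aud, sigma_tac = 10.0, 20.0      # hypothetical cue noise (Hz)

# Optimal weights are proportional to each cue's reliability (1 / variance).
w_aud = sigma_tac**2 / (sigma_aud**2 + sigma_tac**2)   # = 0.8
w_tac = 1.0 - w_aud                                    # = 0.2

rng = np.random.default_rng(0)
n = 100_000
aud = rng.normal(true_freq, sigma_aud, n)   # auditory-only frequency estimates
tac = rng.normal(true_freq, sigma_tac, n)   # tactile-only frequency estimates
fused = w_aud * aud + w_tac * tac           # combined (bimodal) estimates

# The fused estimate is more precise than either unimodal estimate; its
# predicted SD is sqrt(sa^2 * st^2 / (sa^2 + st^2)), roughly 8.94 Hz here.
print(f"auditory SD: {aud.std():.2f}  tactile SD: {tac.std():.2f}  "
      f"fused SD: {fused.std():.2f}")
```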
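For the MVPA aim, decoding frequency from a region amounts to training a classifier on trial-wise voxel activity patterns; a supramodal representation would additionally support cross-modal generalization (train on auditory trials, test on tactile trials). The sketch below demonstrates the within-modality step on synthetic data with scikit-learn; the array shapes, noise model, and classifier choice are assumptions for illustration, not the project's actual analysis pipeline.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import LinearSVC

# Synthetic stand-in for single-trial voxel patterns from one region of
# interest: 200 trials x 50 voxels, labeled by stimulus frequency class
# (e.g., low vs. high vibration frequency). Shapes and noise are assumed.
rng = np.random.default_rng(1)
n_trials, n_voxels = 200, 50
labels = rng.integers(0, 2, n_trials)          # 0 = low, 1 = high frequency
template = rng.normal(0.0, 1.0, n_voxels)      # frequency-coding voxel pattern
patterns = np.outer(labels, template) + rng.normal(0.0, 2.0, (n_trials, n_voxels))

# Within-modality decoding: cross-validated accuracy above chance (0.50)
# implies the region carries a decodable frequency representation.
clf = LinearSVC(max_iter=10_000)
acc = cross_val_score(clf, patterns, labels, cv=5).mean()
print(f"cross-validated decoding accuracy: {acc:.2f}")

# The cross-modal test (not simulated here) would train on auditory trials
# and test on tactile trials; generalization would indicate a shared code.
```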

Public Health Relevance

Our senses of hearing and touch can sometimes signal the same information. This project investigates the possibility that audition and touch rely on shared brain circuitry for processing frequency information. Characterizing the relationship between the brain mechanisms supporting audition and touch can motivate novel approaches to sensory rehabilitation.

Agency
National Institutes of Health (NIH)
Institute
National Institute of Neurological Disorders and Stroke (NINDS)
Type
Research Project (R01)
Project #
5R01NS097462-02
Application #
9312323
Study Section
Mechanisms of Sensory, Perceptual, and Cognitive Processes Study Section (SPC)
Program Officer
Gnadt, James W
Project Start
2016-07-15
Project End
2021-06-30
Budget Start
2017-07-01
Budget End
2018-06-30
Support Year
2
Fiscal Year
2017
Total Cost
Indirect Cost
Name
Baylor College of Medicine
Department
Neurosciences
Type
Schools of Medicine
DUNS #
051113330
City
Houston
State
TX
Country
United States
Zip Code
77030
Crommett, Lexi E; Madala, Deeksha; Yau, Jeffrey M (2018) Multisensory perceptual interactions between higher-order temporal frequency signals. J Exp Psychol Gen
Pérez-Bellido, Alexis; Anne Barnes, Kelly; Crommett, Lexi E et al. (2018) Auditory Frequency Representations in Human Somatosensory Cortex. Cereb Cortex 28:3908-3921
Convento, Silvia; Rahman, Md Shoaibur; Yau, Jeffrey M (2018) Selective Attention Gates the Interactive Crossmodal Coupling between Perceptual Systems. Curr Biol 28:746-752.e5
Pérez-Bellido, Alexis; Pappal, Ryan D; Yau, Jeffrey M (2018) Touch engages visual spatial contextual processing. Sci Rep 8:16637
Crommett, Lexi E; Pérez-Bellido, Alexis; Yau, Jeffrey M (2017) Auditory adaptation improves tactile frequency perception. J Neurophysiol 117:1352-1362