We live in a multisensory world, in which stimuli of various types constantly compete for our attention. Information about objects or events typically arrives through more than one sensory channel, so integrating inputs across sensory systems (e.g. vision and hearing) can enhance the signal-to-noise ratio and lead to more efficient perception and action. There is increasing interest in how stimulus properties in one sensory modality (e.g. vision) correspond to those in another (e.g. hearing). For instance, high-pitched sounds are linked to small visual objects whereas low-pitched sounds are linked to large objects; sounds of high/low pitch are associated with visual stimuli of, respectively, high/low elevation; and even aspects of linguistic stimuli, such as vowel quality, are associated with visual properties such as object size. Such crossmodal correspondences are important factors in multisensory binding. Although knowledge has grown rapidly about which stimulus features are reliably associated across modalities by human observers, there is currently little neural evidence to support a mechanistic account of how crossmodal correspondences arise, or of how they relate to synesthesia, a phenomenon in which some individuals experience unusual percepts (e.g. colors) triggered by particular stimuli (e.g. letters). Our goal is to address these important gaps in knowledge by using functional magnetic resonance imaging (fMRI) in humans to investigate the neural mechanisms underlying crossmodal and synesthetic correspondences, and thus to distinguish between the alternative explanations that have been offered.

A number of possible mechanisms have been proposed for crossmodal correspondences. These include: Hypothesis A, learned associations due to statistical co-occurrences, which would predict that the correspondences are based in multisensory or even classic unisensory regions; Hypothesis B, semantic mediation (e.g. the common word "high" may mediate the link between high pitch and high elevation); and Hypothesis C, conceptual linking via a high-level property such as magnitude. In a series of eight experiments comprising three Specific Aims, we propose to examine these competing accounts, recognizing that some or all of them may be operative, and that the mechanisms may vary between different types of crossmodal correspondence.
The proposed systematic study of the brain basis of correspondences between stimulus properties across sensory systems will provide critical insights into the multisensory processing involved in perception and action, illuminate the multisensory basis of language and music, and expand understanding of synesthesia in relation to normal experience. From a practical standpoint, the proposed work will contribute significantly to the design of sensory substitution approaches for people with visual, auditory, and other sensory deficits, and to the rehabilitation of individuals with multisensory processing abnormalities, including developmental (autism, dyslexia), neurological (hemispatial neglect), and psychiatric (schizophrenia) disorders.