In daily life, we frequently experience correlated sensations across our different sensory modalities. For example, as we climb stairs, our auditory, visual, tactile, and vestibular systems all receive sensations related to the experience of stair-climbing. These multisensory experiences are a key aspect of how we interact with, and learn about, the world around us. Through our experience with the world, our sensory abilities undergo refinements that allow us to optimize our performance in the tasks we perform. These refinements involve fine-tuning of processing within each sensory modality, but, equally importantly, of how we merge information across modalities. While much research has focused on how learning takes place within each individual sensory system, the learning of how information is combined across the senses has been largely neglected. The PIs will conduct a series of experiments that track visual, auditory, and auditory-visual multisensory learning in parallel and discriminate among different theories of multisensory processing and learning. Behavioral and neuroimaging methods will be combined to shed light on the roles that different brain areas, and the interactions between them, play in multisensory learning. Altogether, these studies will provide fundamental insights into how our sensory systems work together and refine their interactions to best serve the tasks we perform.
This project will be the first systematic investigation of multisensory perceptual learning. It will also be the first study of the changes in interactions between brain areas that may occur as a result of sensory learning. Altogether, this study promises to provide foundational knowledge about the brain mechanisms involved in multisensory learning, as well as the mechanisms of learning in general. Understanding multisensory learning can contribute to the development of more effective strategies for learning. These strategies can be used to enhance learning for typically developing children and adults, as well as to facilitate learning and communication for individuals with deprivation in one sense (e.g., individuals with low vision or hearing loss, patients with cochlear implants, or patients undergoing surgery for macular degeneration or cataracts). They can also contribute to devising remedial programs for dyslexia, which appears to involve deficits in combining information across the senses.
Perception is multisensory by default. At any given moment the human nervous system receives information from multiple sensory modalities and must integrate these signals to achieve a unified percept of the environment. In perception research, however, perception has traditionally been viewed as a modular function, with different sensory modalities operating independently. Psychophysical and neurophysiological studies, by contrast, indicate strong interactions between the perceptual processes of different modalities that influence even the earliest stages of perceptual processing. In this project we investigated processes of multisensory learning; the studies conducted over the project period explored several of its different aspects.

In one series of studies we investigated crossmodal transfer, in which training on a task in one modality yields sensitivity improvements in another. For example, training on visual orientation discrimination transfers to haptic performance on the same task. In some cases, training in one sense is even more effective than training on that task in the target sense: training on an auditory rhythm task, for example, produces better visual learning than training directly on the visual task. These studies help us understand how the different senses interact, and they suggest novel approaches that target training to the sensory modality best suited to learning a given task, regardless of the modality in which the task will typically be performed.

In other studies we examined the impact of training multiple senses together; these studies inform us about how the senses can work together during a normal learning process. In tasks for which vision provides the clearest information (such as distance estimation), we found that training on an auditory-visual distance estimation task produced a substantial shift in distance estimates made from auditory cues alone, suggesting that visual training is a useful method of training auditory distance estimation. In the other direction, we found that adding auditory location cues to a visual therapy designed to improve acuity and contrast sensitivity contributed to substantial improvements in visual processing that transferred to real-world visual tasks (such as reading and playing baseball).

Together, these results show that multisensory training, when properly conducted, leads to superior learning, and that understanding multisensory learning can contribute to the development of more effective strategies for learning. These strategies can be used to enhance learning for typically developing children and adults, as well as to facilitate learning and communication for individuals with sensory or cognitive deficits.