Natural sensory inputs are typically complex and often combine multiple modalities. Human speech, for example, combines auditory signals with visual cues, such as facial expressions, that inform the interpretation of the spoken words. Because individual sensory pathways provide only a partial representation of the available sensory information, selecting the context-appropriate behavioral response to a multimodal stimulus often requires integrating information across modalities. How do neural circuits perform this fundamental computation? Our current understanding of sensory processing is built predominantly upon studies of single sensory modalities, working into the brain from the sensory receptors. As a result, we have a deep understanding of peripheral circuit computations in many experimental contexts. However, working inward, cell type by cell type, has left our understanding of the circuits and computational principles that link sensation to action incomplete. Moreover, experimental strategies that focus exclusively on single sensory modalities cannot, by design, reveal how the unified percepts that guide behavior are assembled from information emerging in separate sensory processing streams.

Here we leverage whole-brain imaging and advanced computational approaches to establish the fruit fly Drosophila as a model system for uncovering fundamental principles underpinning multisensory integration. This proposal has three goals. First, we will optimize whole-brain imaging in this experimental system and use this technology to comprehensively characterize the population dynamics underpinning vision, mechanosensation, and taste. Second, we will systematically quantify circuit interactions between these sensory modalities and across-animal variability, testing computational models of statistical inference and identifying the algorithmic bases of multimodal integration. Third, we will link population dynamics to the response properties of single cell types, providing a powerful path to characterizing circuit and synaptic mechanisms. Taken together, by developing and applying improved methods for large-scale monitoring of neural activity, combined with computational modeling and quantitative analysis, this project will greatly expand our understanding of sensory processing mechanisms across the brain.

Public Health Relevance

Information in natural environments is typically complex and often consists of signals from multiple modalities. However, the neural circuit and computational mechanisms mediating multisensory integration remain unknown. Impairments in processing multisensory information are seen in several human disorders, including Parkinson's disease and autism spectrum disorder. This proposal will uncover general principles of sensory processing throughout the brain that will inform studies in more complex systems.

Agency
National Institutes of Health (NIH)
Institute
National Institute of Neurological Disorders and Stroke (NINDS)
Type
Research Project (R01)
Project #
5R01NS110060-02
Application #
9789712
Study Section
Special Emphasis Panel (ZRG1)
Program Officer
David, Karen Kate
Project Start
2018-09-30
Project End
2023-08-31
Budget Start
2019-09-01
Budget End
2020-08-31
Support Year
2
Fiscal Year
2019
Total Cost
Indirect Cost
Name
Stanford University
Department
Neurology
Type
Schools of Medicine
DUNS #
009214214
City
Stanford
State
CA
Country
United States
Zip Code
94305