Human experience is shaped by our senses, which receive diverse inputs from our environment. These varied inputs, however, all contribute to a single integrated representation of the outside world in our minds. Neuroscientists have studied sensory perception, and the integration of different sensory modalities like vision, hearing, and touch, for decades, but important questions remain unanswered about how sensory neurons in the brain function and how the circuits that they form control the flow of information through the brain. Understanding these networks in detail would expand our knowledge of the brain in general and would serve as a starting point for addressing disorders like autism spectrum disorder, in which sensory systems function abnormally. In large part, these sensory circuits have remained mysterious for technical reasons: traditional techniques either sample from the whole brain without resolving individual neurons or record from a few neurons at a time without capturing the larger networks in which they participate. This project addresses this gap using the zebrafish model system, imaging activity across the entire brain while resolving the activity of each neuron individually.
The first aim will map such brain-wide activity while the brain perceives and processes visual or auditory information, using a set of novel approaches for sensory stimulation. Experiments in the second aim will present various visual and auditory stimuli simultaneously, revealing how the brain's functional networks integrate this information. These data will be used to build mathematical network models of how the brain processes and integrates vision and hearing.
The third aim will shed light on functional and structural properties of these sensory neurons, revealing biological details that we will use to refine the purely mathematical models from the prior aims. The resulting biologically grounded models will set the stage for targeted functional experiments in the fourth aim, in which we will optogenetically activate or silence specific circuit elements to test the models' predictions. The overall goal is to describe, for the first time, all of the auditory and visual neurons in the brain, and to develop and test models of how these neurons receive, process, and integrate information across sensory modalities.
Humans and other animals constantly perceive and interpret events in their surroundings and use this information to choose appropriate behaviors. Exactly how the brain's neurons process sensory information, and how they integrate signals from different senses, remains unclear. Our work monitors all of the brain's neurons simultaneously while the senses are stimulated, providing detailed and complete maps of the circuits that detect, process, and integrate sensory information.