What are the relationships between the complex and constantly changing soundscapes that surround us and the electrical activity that represents them in the brain? This project brings together three groups, one experimental and two computational, to address this question at two different levels. First, mechanistic models will be developed, based on known properties of neural networks, to describe how different types of neurons cooperate to represent sounds. Second, the function of these neurons will be characterized by describing the features of sounds that they represent. These two goals will be achieved by combining experimental studies of neural responses to sounds with computational analyses to test candidate mechanisms for how sounds are represented in large cortical circuits. In addition to a deeper understanding of auditory perception, this research will provide insights into general principles of cooperation between neurons within a single neural network. As such, the research has implications for understanding the representation of signals in other sensory modalities as well as the general principles of neural coding in the brain. The research has a number of potential practical applications, including the design of advanced hearing aids and artificial speech recognition systems. Further, given that an altered balance between excitatory and inhibitory neurons has been implicated in a number of attention deficits and psychiatric disorders, including autism and schizophrenia, the project has potential medical relevance. The outreach component of the project will include demonstrations of music and speech perception for K-12 students, as well as public exhibitions.

Technically, the experimental group will record neural responses from the auditory thalamus and cortex to pure tones and to complex sounds known as tone clouds. The tone clouds have sharp transitions like natural sounds, but with well-controlled spectral and temporal power distributions. Computational components of this project will aim to reproduce the neural recordings through analytical modeling and simulations of large-scale neural circuits composed of multiple cell types. The experimental and computational results will be matched not only in statistical terms, such as the average dynamics of neural responses across the population, but also in terms of the specific features of sounds that are encoded by different types of neurons in the network.
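A tone cloud of the kind described above is typically built from many brief pure-tone pips placed at random times and frequencies, so that the overall spectral and temporal power distributions can be controlled while individual pip onsets provide sharp transitions. The sketch below illustrates the general idea; all parameter values (pip duration, frequency range, sampling rate, number of pips) are illustrative assumptions, not the stimuli used in this project.

```python
import numpy as np

def tone_cloud(duration=1.0, fs=44100, n_pips=60,
               f_lo=500.0, f_hi=16000.0, pip_dur=0.03, seed=0):
    """Synthesize an illustrative 'tone cloud': many short tone pips at
    random onset times and log-uniform random frequencies. The number of
    pips and their frequency/time distributions set the average spectral
    and temporal power of the stimulus."""
    rng = np.random.default_rng(seed)
    n = int(duration * fs)
    cloud = np.zeros(n)
    t_pip = np.arange(int(pip_dur * fs)) / fs
    # Hanning envelope smooths each pip's edges while keeping its
    # onset temporally sharp relative to the full stimulus.
    env = np.hanning(t_pip.size)
    for _ in range(n_pips):
        f = np.exp(rng.uniform(np.log(f_lo), np.log(f_hi)))  # log-uniform frequency
        onset = rng.integers(0, n - t_pip.size)              # random onset sample
        cloud[onset:onset + t_pip.size] += env * np.sin(2 * np.pi * f * t_pip)
    return cloud / np.max(np.abs(cloud))  # normalize peak amplitude to 1

cloud = tone_cloud()
```

Because the pips are drawn independently, the statistics of the cloud (e.g., spectral density per octave) can be varied parametrically while the waveform itself remains unpredictable, which is what makes such stimuli useful for probing neural coding.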

This award is cofunded by the Office of International Science and Engineering. Companion projects are being funded by the French National Research Agency (ANR) and the US-Israel Binational Science Foundation (BSF).

Agency: National Science Foundation (NSF)
Institute: Division of Information and Intelligent Systems (IIS)
Application #: 1724421
Program Officer: Kenneth Whang
Budget Start: 2017-10-01
Budget End: 2021-09-30
Fiscal Year: 2017
Total Cost: $950,000
Name: The Salk Institute for Biological Studies
City: La Jolla
State: CA
Country: United States
Zip Code: 92037